Why I’m Excited About Docker


Docker is already being used in production by big companies, has received a tremendous amount of publicity, and continues to spur excitement in the tech industry. For a project that is only two years old, this success is unprecedented.

In the first post of this miniseries, I’d like to share the reasons behind my excitement about Docker and explain why I think Docker has made my work as a developer easier. In the second post in this miniseries, I will summarise what I learnt from the interesting announcements and demos at DockerCon, which took place in San Francisco in June.

My Encounter With Docker

Docker is a relatively new technology. In two years of existence, it has attracted a great deal of publicity and excitement. And many believe, as I do, that it’s going to shape the industry for the years to come.

I came across this technology six months ago when, at a Python meetup, I saw another developer wearing a Docker t-shirt. After the meetup, I went ahead and looked up “Docker” and discovered the vibe and dynamism behind this community.

As a freelance developer and a former network and system administrator, I'm really excited by the possibilities Docker opens up. I remember how challenging and time consuming it used to be to set up your own development environment, test alternative environment configurations, reproduce production environments on a local Linux box, set up your production environment on a VPS, and deploy applications in different ways depending on your VPS provider.

These operations were repetitive, error prone, and time consuming. One could spend an entire week, or even more, just working on operational tasks that were far removed from the real business of the application.

Docker changed all this in a dramatic way.

And my evaluation comes from the perspective of a freelance developer who mostly deals with small projects. For large scale or enterprise operations, Docker represents an even greater step forward: higher velocity and a more unified, integrated, repeatable, and reliable process for building, running, and delivering software.

Docker Features I Am Excited About

There are many features of Docker that make developers, and ops people alike, happier in their jobs. Here are some of the features I appreciate most.


Repeatable Environments

Docker solves the runtime problem by offering repeatable, declarative environments, defined in a file called Dockerfile, or docker-compose.yml in the case of a multi-container application.

Using these definitions, Docker can bring your environment to life anywhere you want to run it. You build your environment once, but can run it everywhere. For an overview of Docker Compose, check out the tutorial.
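To make this concrete, here is a minimal, hypothetical docker-compose.yml for a two-container application. The service names, image, and port mapping are illustrative assumptions, not part of any particular project:

```yaml
# Hypothetical docker-compose.yml: a web app plus a Postgres database.
# Service names (web, db) and the port mapping are illustrative.
web:
  build: .            # build the image from the Dockerfile in this directory
  ports:
    - "8000:8000"     # expose the app on port 8000
  links:
    - db              # make the db service reachable from web
db:
  image: postgres     # ready-made image pulled from Docker Hub
```

A single `docker-compose up` then builds and starts both containers together.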


Easy Experimentation

Docker makes it easier to play with various technologies. With Docker Hub, ready-made images are good to go. You can easily play with tools such as Hadoop, Redis, Elasticsearch, RabbitMQ, and so on, learn how they behave in a specific configuration, and simply destroy the container when you're done experimenting.

This kind of exercise used to require a lot of time and infrastructure. With lightweight Docker containers, it's now a trivial operation. The increased capacity to learn and experiment with new technologies in containers can be transformational from a creative development perspective.
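For example, trying out Redis might look like this; the container name and port mapping are my own choices for illustration:

```shell
# Pull the ready-made Redis image and run it as a disposable container
$ docker run -d --name redis-test -p 6379:6379 redis

# ...experiment against it on localhost:6379...

# When you're done, destroy it without leaving a trace
$ docker stop redis-test
$ docker rm redis-test
```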

Service Independence

With major cloud providers partnering with Docker to provide container services in the cloud, developers have more opportunities and a broader spectrum of cloud services to choose from.

Moving from one cloud provider to another becomes less painful as more components of the container ecosystem become standardised. Not to mention the usual cost benefit of swapping amortised physical hardware costs for the cloud's pay-for-what-you-use model: your app scales down in cost as it scales down in traffic.

Improved OSS Experience

I take pleasure in getting open source applications up and running, seeing how they run, learning from them, and seeing what can be reused.

A stumbling block has always been how difficult it is to assemble the services or micro-components that the application relies on. With Docker, open source project maintainers are sharing not only the application code, but also a pre-made running environment as defined in the project’s Dockerfile.

Traditionally, open source applications have a bad reputation for taking hours, or even days, to get properly set up. But with Docker, it is just a matter of building and running the Docker image. This represents a great step forward for ease of use and contributor onboarding.
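In its simplest form, this workflow is just a clone, a build, and a run. The repository URL and image tag below are placeholders for any project that ships a Dockerfile:

```shell
# Clone, build, and run an open source project that ships a Dockerfile
# (the URL and image tag are placeholders)
$ git clone https://github.com/example/project.git
$ cd project
$ docker build -t project .
$ docker run project
```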

If you want a feel for how easy this workflow is, check out the youtube-audio-dl project on GitHub. The README has instructions for getting started with Docker and Docker Compose.

This reproducibility also has a knock-on effect on collaboration and bug squashing: because everyone receives not only the application code but also the environment it runs in, the usual excuse of "it works on my machine" is a thing of the past.

Development Model

Another benefit of Docker is that it encourages you to shift your thinking towards new and improved development models for the cloud, such as 12 Factor Applications and modularised architectures, i.e. microservices.

Following the 12 Factor Application model, your app is going to be built for the cloud (and the associated cost, scaling, availability, and fault-tolerance models) right from the start.

Developing with microservices can open up many opportunities. For example, it allows you to develop each part of your application with the stack that is best suited for the job. Not only will you make your job easier, but you will also separate out parts of your app that can be worked on independently by people with different skill sets.

How Does Docker Affect My Development Workflow?

With a Docker powered development environment, it is no longer necessary to set up a separate virtual environment with a tool like virtualenv (for Python) in order to isolate application packages from system modules.

Whether it is a Flask or a Django application that I am working on, the definitions in the Dockerfile (for the image) and requirements.txt (for the necessary modules) ensure that I am running the right stack in an isolated manner.
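As a sketch (the base image and the idea of the file layout are assumptions for illustration, not a prescription), the pair of files for a small Django project might look like:

```dockerfile
# Hypothetical Dockerfile for a Django project
FROM python:2.7
ENV PYTHONUNBUFFERED 1
WORKDIR /code
COPY requirements.txt /code/
RUN pip install -r requirements.txt   # install the modules listed in requirements.txt
COPY . /code/
```

Here requirements.txt would simply list the project's modules, for example Django and psycopg2, and the image build installs them in isolation from the host system.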

The ease with which one can fire up a new development environment, try it out, change it, destroy it, and design a new one is just amazing. As a developer, I spend more of my time attending to the real business of the application, instead of troubleshooting installation issues.

With the introduction of Docker in my development environment, the workflow and commands remain generally unchanged. There’s no need to learn a totally new vocabulary to do the development work.

For instance, to fire up a new Rails application, I run:

$ docker-compose run web rails new . --force --database=postgresql --skip-bundle

To create a database:

$ docker-compose run web rake db:create

The same experience applies to the Django framework.

To start a new project, I run:

$ docker-compose run web django-admin.py startproject blog .

And to set up my database and run migrations, I run:

$ docker-compose run web python manage.py syncdb

To a Rails or Django developer, these commands should look very familiar!

This contributes to making the development process under a containerised development environment smooth and easy to adapt to.

There are of course still issues that surface with Ruby on Rails. For instance, whenever your Gemfile changes, perhaps because you have added a new gem, bundle pulls not only the new gem but all the gems in your Gemfile. This isn't the most efficient way to set up the development environment. However, various approaches have been suggested for caching bundle install with Docker.
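One commonly suggested approach is to exploit Docker's build cache by copying the Gemfile into the image before the rest of the source, so the bundle install layer is only rebuilt when the Gemfile itself changes. A sketch (the base image version is an assumption):

```dockerfile
FROM ruby:2.2
WORKDIR /app

# Copy only the Gemfile first so the bundle install layer is cached...
COPY Gemfile Gemfile.lock ./
RUN bundle install

# ...then copy the application source. Ordinary source changes no
# longer invalidate the cached bundle install layer above.
COPY . .
```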

For more information on how to set up Docker Compose in a multi-container development environment, check out the Django docs or the Rails docs.

Finally, running my database driven web application in multiple Docker containers using Docker Compose gives me flexibility and scalability potential I did not have before. My web and database layers can evolve and scale on different paths.
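For instance, with a compose file that defines web and db services (the service name here is an assumption), the web layer can be scaled on its own:

```shell
# Run three web containers against the single database container
$ docker-compose scale web=3
```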


Conclusion

Docker is very exciting.

Innovations brought about by Docker are likely to have an enduring impact on how we develop and deploy applications.

As an individual freelance developer, several things make my life easier:

  • Better, easier runtimes
  • Increased ability to experiment with new technologies
  • Increased independence of service providers
  • Enhanced OSS use and collaboration
  • A shift towards improved development models

Docker also works well with frameworks like Django and Rails and the Docker Compose commands integrate well with the existing commands from these frameworks.

In part two of this mini series, I will share with you some highlights of what I saw at DockerCon and explain why I am excited by these developments.

This article was previously published, with the consent of the author, on the Deis blog.

About the Author

Dr Mazembo Mavungu Eddy is a Tech and Social Science Writer, Developer, Senior Research Consultant and Founder of Dielais. He lives in France and provides training and consultancy on Docker and related container technologies. Get in touch by writing to mazemb_eddy at yahoo.fr or dielainfos at gmail.com. Follow him on Twitter: @mazembo