Web Developer's Guide to Docker

By Martin Rusev

I have been following the development of Docker since version 0.6 (August 2013) and using it in production for the last 6 months. I must admit that even though I have a lot of experience in DevOps territory, it took me some time to really understand what Docker is all about. This post's goal is to explain Docker from an average web developer's perspective and to show some useful ways to optimize your workflow.

What is Docker

Docker is a tool for running isolated containers on Linux. Depending on your style, you can run a single service per container, like Apache or MySQL, or a whole application stack, like Rails or MEAN (MongoDB, Express, Angular, Node.js). There is no right or wrong way - the Docker purists recommend one process per container, but in practice that can be hard to manage.

Unlike traditional VMs, which take minutes to boot, a Docker container starts in under a second. Docker containers are isolated, but not completely - they share the kernel with the host OS. There is almost no memory overhead for running containers; running 100 containers is almost the same as running 100 processes.
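You can see the startup speed for yourself by timing a throwaway container (assuming the ubuntu image is already pulled - otherwise the download will dominate the timing):

```shell
# Start a container, run a trivial command and remove the container on exit.
# On a typical machine the whole round trip finishes in well under a second.
time docker run --rm ubuntu /bin/true
```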

The Docker community is already curating and cultivating generic images that anyone can use as starting points. You can choose from over 14,000 ready-to-go images for popular applications like MySQL, WordPress, MongoDB and Ruby on Rails at the Docker Hub. If you want to run MySQL, for example, you can download it from the Hub and start it with these two lines:

docker pull mysql
docker run --name user -e MYSQL_ROOT_PASSWORD=pass -d mysql
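Once the container is up, one way to talk to it is a disposable client container linked to it - the container name user and the password pass come from the run command above:

```shell
# Start a throwaway mysql client container linked to the "user" container.
# The link alias "mysql" becomes the hostname the client connects to.
docker run -it --link user:mysql --rm mysql \
    sh -c 'exec mysql -h mysql -uroot -ppass'
```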

Docker requires Linux, but you can use it on an OS X or Windows machine with Vagrant or Boot2Docker. Both tools require VirtualBox, so the choice is a matter of personal preference. If you choose Vagrant, running Docker can be as simple as:

Vagrant.configure("2") do |config|
    config.vm.provision "docker",
        images: ["ubuntu"]
end
Docker uses a simple scripting language to automate the creation of containers. It has a very basic set of commands that let you expose ports, mount volumes and run system commands. To build an image you put these commands in a Dockerfile (similar to a Makefile or a Vagrantfile) and then run docker build .

# Apache2 container
FROM ubuntu:12.04
RUN apt-get update && apt-get install -y apache2 && apt-get clean

# apache2 is launched directly (bypassing /etc/apache2/envvars),
# so the variables that file normally provides have to be set here
ENV APACHE_RUN_USER www-data
ENV APACHE_RUN_GROUP www-data
ENV APACHE_PID_FILE /var/run/apache2.pid
ENV APACHE_RUN_DIR /var/run/apache2
ENV APACHE_LOCK_DIR /var/lock/apache2
ENV APACHE_LOG_DIR /var/log/apache2

CMD ["/usr/sbin/apache2", "-D", "FOREGROUND"]
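With the Dockerfile saved in the current directory, building and running the container could look like this (the image tag my-apache is just an example name):

```shell
# Build an image from the Dockerfile in the current directory
docker build -t my-apache .

# Run it in the background, mapping container port 80 to host port 8080
docker run -d -p 8080:80 my-apache
```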

Docker for web development

Dev environments

It is always a challenge to create a local environment that replicates production. Minimizing the differences between dev and production helps us avoid last-minute deployment issues like missing dependencies or configuration differences. Previously I would solve these issues by running Vagrant on my machine, but if you have to do vagrant up for 5-6 machines it can take a lot of time. Booting containers in Docker is orders of magnitude faster - as an example, booting 3 PostgreSQL containers (master + 2 slaves) takes around 5-6 seconds on an average laptop.
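A rough sketch of that setup, assuming the official postgres image and leaving the actual replication configuration out, might look like:

```shell
# Start a master and two slave containers from the official postgres image.
# Replication itself still has to be configured inside the containers -
# this only illustrates how quickly the containers come up.
docker run -d --name pg-master postgres
docker run -d --name pg-slave1 --link pg-master:master postgres
docker run -d --name pg-slave2 --link pg-master:master postgres
```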

Safe sandboxing

Want to install a new NoSQL database or try out libraries from questionable sources? Docker is the way to go: you install the library in a container, play around and then throw it away. Docker is quite secure, but to be completely safe, make sure that nothing runs as root in your containers.
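A typical throwaway session looks something like this:

```shell
# Start a disposable Ubuntu container; --rm deletes it on exit
docker run --rm -it ubuntu /bin/bash

# Inside the container you can install and test anything:
#   apt-get update && apt-get install -y some-questionable-package
# When you exit the shell, the container and everything in it is gone.
```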


Deployment

Using the same image on dev and in production is one of Docker's key selling points. You snapshot a container into an image, use it on your dev box and then deploy the same image in production. In theory, deployment could not get easier than this.
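For example, you could snapshot a running container with docker commit and ship the resulting image through a registry (the names below are placeholders):

```shell
# Snapshot the running container "webapp" into an image
docker commit webapp myuser/webapp:v1

# Push it to a registry, then pull and run the identical image in production
docker push myuser/webapp:v1
docker run -d myuser/webapp:v1
```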

Right now, managing containers in production requires an additional amount of ops work - especially if the containers depend on each other (master DB + slave). I believe that this will improve soon; until then you can use one of the popular PaaS offerings built on top of Docker.

Unit testing

If you are working on a web application, there is a big chance that you have some API integrations with other web services (GitHub/Facebook/Twitter login, for example). You can use mocks to simulate test responses, but if the provider changes the API, you will never know - your mock tests will still pass. The best way to avoid problems is to test against the real API, but these tests usually run very slowly. With Docker you can dramatically speed up your test suite by running the API integration tests in parallel, each in a separate container. For example, you can have something like this:

declare -a arr=(twitter facebook braintree github)

for i in "${arr[@]}"; do
    docker run {container_vars} python manage.py test integrations/$i &
done
wait


Well, these are just some of the use cases. In the coming weeks I plan to go into more detail about the specifics and share how I improved my personal dev workflow with Docker. If you are a web developer or a devops guy, I will really appreciate it if you take a couple of seconds and share your experience with Docker in the comment section below.