Using Docker to deploy complex stacks mixing Python code and non-Python dependencies
Ken Cochrane
- Audience level: Novice
- Category: Best Practices & Patterns
Description
Since Docker was presented at PyCon 2013, a lot has happened. Docker containers are now routinely used to ship software stacks, simple or complex, from development to staging to production, and everything in between. We will show how we used Docker to "containerize" our Python project, improving our application development and speeding up our testing and deployment.
Abstract
Our application is a typical Django application with lots of dependencies: a database, Redis, Elasticsearch, background workers, external web services, and so on. When someone new joined the team, it would take almost a full day to get them fully set up and running. Each component on its own wasn't a problem to set up, but getting them all working together was.
Enter Docker. Each app, each database, and each service is "containerized" using a Dockerfile. Dockerfiles are functionally similar to Puppet manifests or Chef cookbooks, but they are generally much easier to write and maintain. The end result is a set of containers that can easily be deployed to any kind of environment. Setting up a new development environment becomes trivial, running integration or unit tests is greatly simplified, and deployment to production is reproducible and reliable. A minimal sketch of such a Dockerfile is shown below.
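To give a flavor of what this looks like, here is a minimal, illustrative Dockerfile for the Django app container. It is a sketch, not our actual file: the base image, the requirements.txt location, and the runserver command are assumptions made for the example.

    FROM python:3
    # Install the Python dependencies first so this layer can be cached
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    # Copy the rest of the Django project into the image
    COPY . .
    EXPOSE 8000
    # Development server for illustration; production would use a WSGI server instead
    CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]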
Docker is often compared to Vagrant, but it is much more lightweight (it uses containers instead of VMs), which makes it better suited to large, complex stacks that would otherwise require multiple VMs (and an inappropriate amount of resources).
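As a rough illustration of how lightweight this is, a stack like the one described above can be brought up with a handful of docker run commands; the image and container names here are made up for the example.

    # Start the backing services as containers
    docker run -d --name db postgres
    docker run -d --name redis redis
    # Start the Django app container, linked to the services it depends on
    docker run -d --name web --link db:db --link redis:redis -p 8000:8000 myproject-web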