It's been less than two years since Docker sparked the trend in software containers. Since its modest presentation at PyCon in 2013, the startup has vaulted to a valuation of nearly $1 billion, drawn 2,500 attendees to DockerCon, and become a marketable skill, entering Hacker News' top 20 most frequently requested job skills.
Its raison d'être is clear enough: "Docker is an open platform for developing, shipping, and running applications." Docker lets developers build against virtually any environment (any Linux distribution, database back-end, and so on) from their laptops.
This isn't exactly new; Google has used container technology since 2004. But Docker made the most of that technology by making it quicker and easier for developers to use. Not only does Docker simplify virtualization on the developer's machine, it also lets that "virtual" infrastructure stack be reused in other environments, whether testing or production. This fluidity has vastly shortened the time between writing code and seeing it in production.
Want to learn more about Docker's features? Check out Mano Marks's blog post, "10 content tools from the Docker community."
"Docker was not the first container technology out there," said Vidar Langseid, Lead Cloud Infrastructure Engineer at eZ, "but you could say Docker was the one that gave container technology to the people. Not because it was open source (others were, too), but because Docker suddenly made containers convenient and easier to work with."
Docker's lightweight containers also let sysadmins scale their projects. They can quickly assemble applications from components, and Docker helps eliminate the errors that can creep in when shipping code. Nearly everyone in an organization can understand how an application works, because containers are easy to build, iterating on versions is quick, and the changes between versions are easy to spot. Docker containers run everywhere, from laptops to data centers to private and public clouds, so applications can easily move between an individual computer, a testing environment, and the cloud.
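As a minimal sketch of what "assembling an application from components" looks like in practice, a Dockerfile declares a base image and layers the application on top; the base image and packages below are illustrative, not a real eZ configuration:

```dockerfile
# Start from a known base image and add the application's components.
# The resulting image runs identically on a laptop, a test server, or a cloud VM.
FROM ubuntu:14.04
RUN apt-get update && apt-get install -y apache2 php5
COPY . /var/www/html
EXPOSE 80
CMD ["apachectl", "-D", "FOREGROUND"]
```

Because each instruction produces a versioned layer, the changes between two iterations of an image are easy to inspect and rebuild.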
Why eZ loves Docker
Around 2013, eZ was looking for ways to simplify development, accelerate developer onboarding, and facilitate collaboration by using the same technology across multiple systems inside the company. eZ considered technologies such as Puppet and Ansible, but André Rømcke, VP of Engineering at eZ, stayed on the lookout for alternatives and eventually found Docker.
"We were looking for a technology that we could use in all parts of our systems, and then potentially, in the future, in all our customers' systems," André said.
André began using Docker shortly after it was released. Since then, it has become an important part of several eZ developers' workflows, including Vidar's; he uses it daily.
"It's very cool to be able to create an environment based on any Linux distribution by issuing a simple Docker command," Vidar said. "This is great when you quickly want to test some new application knowing it won't be able to tamper with your system. It's easy to get rid of the application afterwards, too: just remove the container. On the professional side, Docker makes it possible to deploy and host web applications in a completely new way. It is really a game-changer."
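The "simple Docker command" Vidar describes can be sketched with standard Docker CLI usage (the image names here are illustrative, and running these requires a Docker daemon):

```shell
# Start a disposable environment based on any Linux distribution;
# --rm deletes the container on exit, so it can't tamper with the host.
docker run -it --rm debian:latest bash

# Try out a new application in isolation as a named container.
docker run -d --name scratch-app nginx:latest

# Getting rid of it afterwards is one command: remove the container.
docker rm -f scratch-app
```

The key point is that nothing installed or modified inside the container survives its removal, which is what makes this workflow safe for quick experiments.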
Currently, eZ uses Docker for three internal purposes:
- Automated testing of new code (i.e. pull requests), run before the code is accepted and merged into the repository
- A master build of eZ Platform and eZ Studio, deployed daily and used by Product Management to evaluate the latest changes to the platform
- Demo environments, which Sales and Product staff use to show customers and partners eZ's products in action
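A pull-request test run of the kind described in the first point could look roughly like this; the image tag and test command are assumptions for illustration, not eZ's actual setup:

```shell
# Build an image containing the pull request's code (hypothetical tag).
docker build -t ez-platform:pr-test .

# Run the test suite inside a throwaway container; --rm cleans up afterwards.
docker run --rm ez-platform:pr-test php vendor/bin/phpunit

# A non-zero exit status fails the build, so only passing code gets merged.
```

Because each run starts from a fresh container, test results aren't polluted by leftover state from earlier runs.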
Not all eZ developers use Docker (or are even aware of it), but when they commit code, Docker serves as the underlying infrastructure for automated testing and deployment. Within eZ, Docker is a highly useful tool for improving the "deployability" of eZ products and putting solutions in the hands of users.