Software applications are complex beasts. More complex than the average user may realise. Take the homepage of your favourite news site, for example, or Amazon.co.uk.
When you log on and purchase a recent book, or search for your favourite type of tea, you may think all the details are generated by magic, and that the functionality rests on some mysterious and complex exchange of bits and bytes.
Which is kinda true. I mean, the complexity is real. Less magic, but don’t tell software developers that.
In reality, you are only seeing the front end. What you’re not seeing are the back-end operations: API calls to fetch data, databases for storage, and various other processes that enable you to see the daily news when you request it.
These operations were (and often still are) bundled into one enormous package, put onto a server somewhere and run from there. But deploying applications with this ‘monolithic’ architecture leads to several problems:
- Complexity issues
Take into account all the modules involved, the dependencies they need to function, and the understanding required to maintain the application, and it becomes apparent that the bigger the application, the more complex it is.
- Single point of failure
With more operations running in the background to keep the system operable, if one part goes wrong, the entire system can be brought down with it.
- Updates can break compatibility if not properly tested
A critical reason for intensive testing of these big web apps is that changes to dependency versions, environment updates, or even slight modifications to the code can easily have unintended effects on other parts of the application.
Add the difficulty of getting consistent environments configured across workstations and servers, and you have a colossal headache for testers, developers and anyone involved with deployments.
Micro-services to the rescue
We now build many applications in a way that overcomes these known problems: each component is separated into its own individual service, able to communicate with the others via a REST API. This enables developers to create applications that are more flexible, more scalable and easier to maintain.
However, as there are now so many parts of an application to configure and deploy independently, setting each one up in every environment that needs it can be its own separate challenge.
This is where a program like Docker helps: developers can gather an application’s modules, dependencies and libraries into a Docker image, then send that image to anyone who needs to use it, enabling them to generate a consistent environment inside a Docker container, no matter which OS they are using.
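As a rough sketch of what building such an image looks like, here is a hypothetical Dockerfile for a small Node.js service (the base image tag, file names and port are assumptions for illustration, not anything specific to this article):

```dockerfile
# Start from an official Node.js base image
FROM node:18

# Copy the application into the image and install its dependencies
WORKDIR /app
COPY package*.json ./
RUN npm install
COPY . .

# The port the app listens on, and the command that starts it
EXPOSE 3000
CMD ["node", "server.js"]
```

Running `docker build -t my-app .` against this file produces an image that behaves the same on any machine with Docker installed, which is exactly the consistency described above.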
But isn’t that like a virtual machine?
Quick answer: Not quite.
VMs are their own separate layer of abstraction on top of the host’s operating system. Each instance has its own operating system, which it has to manage, and a piece of software called a hypervisor is what allows VMs to be created and run on a machine or server.
Docker has a distinct advantage here because there is no hypervisor for it to manage. Instead, it has the Docker Daemon, which allows Docker containers to be created and run on top of your existing operating system.
If you want to learn more about the Docker Daemon, I recommend checking out this post.
The advantage is that you are not waiting for a VM to boot before you can use the applications stored within it. You can instead instantly create a new Docker instance with a web server, database, or a particular version of a headless browser, and start using it within a container straight away.
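For a taste of how instant that is, the commands below spin up a web server and a database as containers (the `nginx` and `mongo` images and the port mappings are just common examples, not anything the article prescribes — these need a running Docker daemon):

```shell
# Start an nginx web server in the background, mapping host port 8080 to the container
docker run -d -p 8080:80 nginx

# Start a MongoDB database container on its default port
docker run -d -p 27017:27017 mongo

# See both containers running within seconds
docker ps
```

There is no boot sequence to wait for; each container starts in roughly the time it takes the process inside it to launch.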
How Docker helps testers
Let’s take a scenario in which you are testing a web application with multiple dependencies: the server is in Node.js, there’s a MongoDB database, and various libraries and packages are baked in. With testers running different operating systems and being globally dispersed, if someone doesn’t have the correct version of a file, or has compatibility issues with a required process, getting the application running on their local machine will be a headache.
This is where Docker can be a lifesaver for testers and developers, with the words ‘it works on my machine’ instantly nullified.
Environments can be shared across teams instantly, providing higher levels of consistency and compatibility, and reducing the setup of your test environment from hours to minutes, often with a single command.
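As a sketch of what that single command could be driving, a hypothetical docker-compose.yml for the Node.js-plus-MongoDB scenario above might look like this (the service names, image tag and ports are illustrative assumptions):

```yaml
version: "3.8"
services:
  app:
    build: .            # build the Node.js app from the project's Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db              # start the database before the app
  db:
    image: mongo:6      # official MongoDB image from Docker Hub
    ports:
      - "27017:27017"
```

With a file like this in the repository, every tester runs `docker compose up` and gets the same server, the same database version and the same wiring between them, regardless of their operating system.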
Are there any negatives to using Docker?
Once you use Docker and run applications within containers, you will wonder why you didn’t discover it sooner.
Docker first became available in 2013, and it took me a while to get to grips with the commands and with everything living in its own separate container, each with its own processes, networking, and the ability to be totally isolated through the use of namespaces.
Docker can be a lot of work to get to grips with. While downloading images and creating containers is fairly easy (thanks to Docker Hub), depending on what you want to do with it and your level of comfort with the command line, the complexity of such a tool can seem daunting.
But don’t worry. There is a handy cheat sheet available here, which I highly recommend keeping at hand for reference. It will remind you of the most important commands and enable you to take advantage of this amazing tool.
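To give a flavour of what such a cheat sheet covers, here are a few of the everyday commands you will reach for most often (the `nginx` image and the `<id>` placeholders are just examples; these assume a running Docker daemon):

```shell
docker pull nginx          # download an image from Docker Hub
docker images              # list the images on your machine
docker run -d nginx        # create and start a container in the background
docker ps                  # list running containers
docker exec -it <id> bash  # open a shell inside a running container
docker stop <id>           # stop a running container
docker rm <id>             # remove a stopped container
```

Once these half-dozen commands feel familiar, most day-to-day Docker work is covered.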
Where can I learn more?
You can learn more about the advantages of using Docker over on their homepage. I also recommend checking out this tutorial from freeCodeCamp, which aims to get you up to speed and productive with Docker in no time.
There are many other advantages to using Docker, from running automated tests and headless browsers within an isolated environment, to running applications that aren’t compatible with your machine’s operating system. And although there was once a time when things were simple and often ‘just worked’, that just isn’t always the case now for many programs and their associated processes. Tools like Docker break down the complexity barriers you may be facing, and allow for seamless operability across multiple people’s machines.