
Exploring technologies with Docker

By Marcin Sałata

Marcin is a Senior Software Engineer at AUTO1 Group.

Dec 19, 2018

tl;dr

Using Docker Compose we are able to quickly set up complex environments consisting of multiple containers, which allows us to speed up the software evaluation process.

Intro

Things evolve quickly: tools come and go, and last year's cutting edge is often legacy today. Dozens of new tools emerge every month, and the approach to staying up to date also needs to evolve.

Containers in production, with all their benefits and challenges, are a common thing nowadays, but let us not forget that containers can also help us in other use cases. One of them is improving the technology evaluation process.

Docker at home

Many moons ago, when a new fancy tool was released and looked promising enough to give it a try, the options were limited: build it from sources, use an installer, or use a package manager. Giving a promising new tool a try was quite time consuming, and the steps required to download, install, and configure the system and its dependencies were usually not the most exciting part of evaluating a new message broker or database. Now containers or PaaS solutions allow us to shift the focus from setup directly to the features that a particular product offers.

Level 1: Basic usage

Using Docker, one terminal command is enough to download, run and evaluate an interesting application.

docker run postgres

The command above will pull the latest version of the PostgreSQL image from Docker Hub and run it locally. This is a quick and straightforward way of running hundreds of interesting applications. The container runs in a sandboxed environment, so we do not need to think about operating system settings and other specific configuration that may be required with a standard installation. Although this is simpler than using installers or building from sources, compared to common package managers, where one command is usually enough as well, the user experience looks quite similar.
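In practice a few extra flags are usually useful, for example to run the container in the background, expose the default port and set a superuser password (the container name and password below are only illustrative):

docker run --name eval-postgres -e POSTGRES_PASSWORD=secret -p 5432:5432 -d postgres

When we are done evaluating, the container can be stopped and removed again with docker stop eval-postgres followed by docker rm eval-postgres.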

Level 2: Next step

We can observe the real benefits of using Docker in a local environment when we want to build a more complex setup that contains multiple applications linked together, as things are usually more complicated than just running a database or a monitoring tool separately. The challenge often starts when we would like to evaluate a monitoring tool that needs a datasource, and the entire environment should be relatively easy to set up and manage. That is where Docker Compose comes to help.

As stated in the official documentation, Compose is a tool that allows us to define and run multi-container applications. Using it in the most common scenarios comes down to creating a YAML file with the required images and configurations.

version: '3'
services:
  myService1:
    image: service1
    # configuration...
  myService2:
    image: service2
    # configuration...

Having the docker-compose.yml file ready, we are able to start everything with one terminal command.

docker-compose up

Docker will build or download all the necessary images, set up the required configuration and networks, and run the defined containers for us.
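If preferred, the containers can also be started in the background and later torn down again; this is standard Compose usage, not specific to this example:

docker-compose up -d
docker-compose down

The down command stops and removes the containers together with the default network that Compose created for the application.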

Level 3: Hands-on experience

In the example below we would like to see Grafana, an open source monitoring tool, in action with PostgreSQL as its datasource.

Define PostgreSQL

The first thing we need to do is create an empty docker-compose.yml file and add PostgreSQL as the datasource.

version: '3'
services:
  myDatasource:
    image: postgres

Above we have defined one service called myDatasource that will be created using the postgres image.
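Depending on the version of the postgres image, the container may refuse to start without a superuser password. In that case an environment variable can be added to the service definition; the value below is only an example:

  myDatasource:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=secret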

Define Grafana

Having our datasource configured, we can extend our Compose file with a second service - Grafana.

version: '3'
services:
  myDatasource:
    image: postgres
  myMonitoring:
    image: grafana/grafana
    ports:
      - 3000:3000

Above we have defined a second service called myMonitoring that will be created using the grafana/grafana image. In addition to the basic service definition we have also exposed port 3000, so the Grafana user interface can be accessed from outside the container.
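Once both services are defined and started, their status and logs can be inspected with the usual Compose commands, using the service names defined above:

docker-compose ps
docker-compose logs myMonitoring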

Troubleshooting

At this point we have configured our environment, so by running docker-compose up the two containers should be configured and started by Docker. Navigating to localhost:3000 will then allow us to access the Grafana web interface, where the datasource can be set up. One thing worth noticing is that if we configure our PostgreSQL datasource with the default suggested host, localhost:5432:

Datasource Config

we will end up with the following error message:

dial tcp 127.0.0.1:5432: connect: connection refused

This is happening because Compose creates a dedicated network for our application, and inter-container communication happens via service names. In our example, using myDatasource:5432 as the host solves the connection issue.
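To double-check from the host that the database container itself is up and accepting connections, we can run psql inside it, assuming the image's default postgres user:

docker-compose exec myDatasource psql -U postgres -c "SELECT 1;"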

Further explorations

This relatively simple example consisting of two containers can be considered a base for further journeys into the world of containers. Public cloud providers offer multiple services, such as Amazon Elastic Container Service or Azure Kubernetes Service, which can be utilised as the next step in building container-based environments.
