Integration testing for a bunch of services with Pytest & Docker Compose

Iuliia Volkova
9 min read · May 31, 2020

Hi! Today I want to share a guide on how to set up integration tests for a bunch of services. Last month I worked on a similar task, and for some members of the team the result looked like something new, so I thought it would make sense to write a quick guide about it.

Besides, I really need to write tests for https://github.com/xnuinside/gino-admin. Of course, I need a set of different tests, but today I will prepare a setup for tests that check that the examples run correctly with new changes in the "Gino Admin" code.

Introduction

In this guide I will use:

  • pytest-docker-compose
  • requests (we could easily use something async like httpx or aiohttp, but in this case it makes no sense, because I will not add any concurrency to the tests in this tutorial, so they run one by one)
  • docker & docker compose

You have a project with several microservices or services (it does not matter in this context) and a DB, and you need to run tests on this infrastructure to be sure that the components work together.

My Demo Project setup

For the tutorial I will take `fastapi_as_main_app` from Gino Admin examples, because it contains Web App on FastAPI, Admin panel & DB PostgreSQL. Source code of this tiny project here: https://github.com/xnuinside/gino-admin/tree/master/examples/fastapi_as_main_app

At the beginning we need 2 things:

1. bring up & run our services

2. be sure that they can see each other and can connect

Docker-Compose

If in 2020 you still haven't used or tried docker-compose, just google a tutorial about it and give it a chance. It's really irreplaceable for multi-service backends, at least for dev and test environments.

It would be a bad idea to mix 'take & play' examples with tests, so I will put all files needed for the tests inside the tests/ dir.

Let's create a folder integration_tests/ and, inside it, a docker/ folder, where we will store all the Dockerfiles used for the integration tests. For now I will create 2 Dockerfiles: one for the Main App and one for the Admin Panel. This gives the following structure in the tests folder:

Tests folder structure
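The original article shows the layout as a screenshot; roughly (the exact nesting of the docker/ subfolder is an assumption based on the Dockerfile paths used later in the compose file), it looks like this:

tests/
    integration_tests/
        docker/
            fastapi_as_main_app/
                Dockerfile        # Main App
                Dockerfile-admin  # Admin Panel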

Docker context

Now, I need to pay attention to the Docker context. Because I need access to the examples/ folder from the Dockerfile (and, as you know, Docker does not allow access to any paths outside the Docker context), my Docker context will be the main gino_admin/ folder that contains examples/.

So when I define the Dockerfiles I keep in mind that my working directory will be the gino_admin/ folder, not the docker/ directory where the Dockerfiles are placed.

In my case there will be two Dockerfiles: one for the admin panel, one for the main app. PostgreSQL will be built from the official image.

Dockerfiles

Pretty simple Dockerfiles: install requirements, copy the source code, run. Here is the one for the admin panel:

FROM python:3.7.7 as base
WORKDIR /app
COPY examples/fastapi_as_main_app/requirements.txt /app/requirements.txt
RUN pip install gino-admin==0.0.9
COPY examples/fastapi_as_main_app/src/admin.py examples/fastapi_as_main_app/src/db.py /app/
CMD python admin.py

For the admin panel I also need to add the gino_admin sources and install the package from them inside the image, because this is what I test: the code, not releases from PyPI. The adjusted Dockerfile is sketched below.
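The original post shows the adjusted Dockerfile-admin as an image; here is a minimal sketch of what it might look like, assuming the build context is the gino_admin/ repo root and that the repo root is pip-installable (the exact paths in the real file may differ):

FROM python:3.7.7 as base
WORKDIR /app
COPY examples/fastapi_as_main_app/requirements.txt /app/requirements.txt
# (sketch) copy the library sources from the repo and install Gino Admin from them, not from PyPI
COPY . /gino_admin_src
RUN pip install /gino_admin_src
COPY examples/fastapi_as_main_app/src/admin.py examples/fastapi_as_main_app/src/db.py /app/
CMD python admin.py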

And for Main App:

FROM python:3.7.7 as base
WORKDIR /app
COPY examples/fastapi_as_main_app/requirements.txt /app/requirements.txt
RUN pip install gino==1.0.0 && \
pip install gino-starlette==0.1.1 && \
pip install fastapi==0.54.1 && \
pip install uvicorn==0.11.5
COPY examples/fastapi_as_main_app/src/main.py examples/fastapi_as_main_app/src/db.py /app/
CMD uvicorn main:app --host 0.0.0.0 --port 5050

Create test-docker-compose.yml

Let’s go and create a test-docker-compose.yml file with all our services that are needed for a correct test.

Go into tests/integration_tests/ and create test-docker-compose.yml. Pay attention to the build context and Dockerfile paths.

Our test-docker-compose.yml will be:

version: "3"

services:
  postgres:
    image: "postgres:9.6"
    environment:
      - POSTGRES_USER=gino
      - POSTGRES_PASSWORD=gino
      - POSTGRES_DB=gino
    ports:
      - "5432:5432"
    volumes:
      - ./data/postgres:/var/lib/postgresql/data

  fastapi_main_app_main:
    environment:
      - DB_HOST=postgres
    build:
      context: ../../
      dockerfile: $PWD/docker/fastapi_as_main_app/Dockerfile
    ports:
      - "5050:5050"
    depends_on:
      - postgres

  fastapi_main_app_admin:
    environment:
      - DB_HOST=postgres
    build:
      context: ../../
      dockerfile: $PWD/docker/fastapi_as_main_app/Dockerfile-admin
    ports:
      - "5000:5000"
    depends_on:
      - postgres

I need to set the variable DB_HOST=postgres because it is used inside the code to decide which PostgreSQL host to connect to: if the app runs inside the docker-compose cluster, the host is postgres, if not, localhost. The sketch below illustrates the idea.
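A minimal sketch of that check (the actual code lives in the example's db.py; the variable names here are an illustration, not the exact source):

import os

# "postgres" is the service name from test-docker-compose.yml;
# outside the compose network we fall back to localhost
db_host = os.environ.get("DB_HOST", "localhost")
db_dsn = f"postgresql://gino:gino@{db_host}:5432/gino"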

Cool. Now time to run:

docker-compose -f test-docker-compose.yml up --build

Wait-for script

In our case we need to wait until the PostgreSQL DB is ready to accept connections.

We will place it in the same folder, next to the compose .yml file:

wait_for script

The content of the script is very simple. To check the DB connection I will use the same libraries that already exist as dependencies, but in your case you can use anything; the simplest & most popular way is with psycopg2.

I will use gino because, as I said, we already use it in the app anyway:

Not much logic: just try to connect; on error, sleep for 8 seconds and try again (up to 5 attempts).

#!/usr/bin/python
""" wait for PostgreSQL DB up """
from time import sleep
import asyncio
from gino import Gino


async def main():
    db = Gino()
    await db.set_bind('postgresql://gino:gino@postgres:5432/gino')
    await db.pop_bind().close()


if __name__ == "__main__":
    for _ in range(5):
        try:
            asyncio.get_event_loop().run_until_complete(main())
            print("DB Connected")
            exit(0)
        except Exception as e:
            print(e)
            print("Postgres is not available. Sleep for 8 sec")
            sleep(8)
    else:
        exit(1)

Now we need to modify the Dockerfiles: before starting the servers we need to run this script, and start the servers only when it returns 0.

Change the last lines of the main app Dockerfile to:

COPY tests/integration_tests/wait_for.py /wait_for.py
CMD python /wait_for.py && uvicorn main:app --host 0.0.0.0 --port 5050

And in the admin panel Dockerfile to:

COPY tests/integration_tests/wait_for.py /wait_for.py
CMD python /wait_for.py && python admin.py

Note: in the same way you can run any pre-setup action that you need to make your service work.

Now run the cluster again with:

$ docker-compose -f test-docker-compose.yml up --build

And we get a successful result: all services are up & working.

Great. We have a test cluster; now we need something that allows us to run tests against this cluster.

Pytest plugin for Docker Compose

If you have never worked with Pytest, it makes sense to first take a look at the official documentation and some tutorials. At the very least you need to understand what 'fixtures' are: https://docs.pytest.org/en/latest/fixture.html .

There are 3 different Pytest plugins for running tests on a docker-compose infrastructure. I checked all 3, but at the end of the day settled on https://github.com/pytest-docker-compose/pytest-docker-compose . It worked better for me than the others in terms of speed and clarity of usage.

Let's start with the packages installation:
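The original shows the exact install command as an image; presumably it boils down to installing the plugin and the HTTP client on top of pytest, something like:

pip install pytest pytest-docker-compose requests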

Now, let's create our conftest.py ( https://docs.pytest.org/en/2.7.3/plugins.html?highlight=re ) for pytest, where we will enable the plugin and define our custom path to the docker-compose file that we use to run the tests.

import os
import pytest

pytest_plugins = ["docker_compose"]


@pytest.fixture(scope="module")
def docker_compose_file(pytestconfig):
    return os.path.join(str(pytestconfig.rootdir), "test-docker-compose.yml")

And with that, we have finished preparing our common infrastructure.

Next, let's set up our first test module.

Create Tests Module

As you remember, I have several examples, and each of them is a set of services that works with the DB. Each example will get its own test module and its own fixtures.

Services Fixtures

Let's create our first test module. I will keep the naming consistent and call it test_fastapi_as_main_app_example.py.

Now our tests folder looks like:

Tests structure
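After adding the conftest, wait_for script, compose file and the test module, the folder roughly contains (again a sketch; the screenshot in the original shows the exact layout):

tests/
    integration_tests/
        docker/
            fastapi_as_main_app/
                Dockerfile
                Dockerfile-admin
        conftest.py
        test-docker-compose.yml
        test_fastapi_as_main_app_example.py
        wait_for.py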
Now I will add fixtures for both services. We need them:

  1. To check that they are up & running successfully
  2. To get their URIs, which we will use in the tests

import pytest
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


@pytest.fixture(scope="module")
def main_app_url(module_scoped_container_getter):
    """ Wait for the api from fastapi_main_app_main to become responsive """
    request_session = requests.Session()
    retries = Retry(total=5, backoff_factor=3, status_forcelist=[500, 502, 503, 504])
    request_session.mount("http://", HTTPAdapter(max_retries=retries))

    service = module_scoped_container_getter.get("fastapi_main_app_main").network_info[0]
    api_url = f"http://{service.hostname}:{service.host_port}"
    return api_url


@pytest.fixture(scope="module")
def admin_url(module_scoped_container_getter):
    """ Wait for the api from fastapi_main_app_admin to become responsive """
    request_session = requests.Session()
    retries = Retry(total=5, backoff_factor=3, status_forcelist=[500, 502, 503, 504])
    request_session.mount("http://", HTTPAdapter(max_retries=retries))

    service = module_scoped_container_getter.get("fastapi_main_app_admin").network_info[0]
    api_url = f"http://{service.hostname}:{service.host_port}/admin"
    return api_url

What you need to pay attention to here:

module_scoped_container_getter is a special fixture provided by the plugin. pytest-docker-compose ships such container-getter fixtures for 4 scopes:

  • function_scoped_container_getter
  • class_scoped_container_getter
  • module_scoped_container_getter
  • session_scoped_container_getter

I use module_scoped_container_getter because each of my examples is a separate group of apps with its own DB schema, so for each module I will need to drop the DB tables and create them for the concrete example that I test in that module.

Next, pay attention to this line:

service = module_scoped_container_getter.get(
    "fastapi_main_app_main").network_info[0]

In the .get(...) method you need to provide the service name exactly as it is defined in the docker-compose yml file. Now add the first tests to the same test_fastapi_as_main_app_example.py file:

def test_main_service_run(main_app_url):
    result = requests.get(main_app_url)
    assert result.status_code == 200


def test_admin_service_run(admin_url):
    result = requests.get(admin_url)
    assert result.status_code == 200

Time to run tests.

How to run tests

To run the tests you have 2 possible ways.

The first one builds and runs docker compose and executes the tests in one go:

pytest . --docker-compose=test-docker-compose.yml -v

The second way saves time while creating/debugging tests, because you don't rebuild the containers on each run:

docker-compose -f test-docker-compose.yml up --build
# build & run the test cluster
# then, in a new terminal window:

pytest . --docker-compose=test-docker-compose.yml --docker-compose-no-build --use-running-containers -v

Choose whichever way you like and run the tests.

And we get a green result:

Great, all works. Services up and we can use them from tests.

Auth fixtures

Now let’s add more fixtures. I want to test the REST API of the admin panel, but to make API calls I need to get an auth token first.

So I will define 2 fixtures: the first to authenticate, the second to upload a preset.

@pytest.fixture(scope="module")
def admin_auth_headers(admin_url):
    """ get auth token """
    headers = {"Authorization": "admin:1234"}
    result = requests.post(f"{admin_url}/api/auth/", headers=headers)
    token = result.json().get("access_token")
    headers = {"Authorization": f"Bearer {token}"}
    return headers


@pytest.fixture(scope="module")
def initdb(admin_url, admin_auth_headers):
    """ run api call with auth token """
    result = requests.post(f"{admin_url}/api/presets/",
                           json={"preset_id": "preset_1", "drop": True},
                           headers=admin_auth_headers)
    assert result.status_code == 200

Now let's write a test for the main app endpoint /users. This endpoint returns {"count_users": await db.func.count(User.id).gino.scalar()}.

So the /users endpoint must return {"count_users": 5}.

Add test:

def test_main_service_users(main_app_url, initdb):
    result = requests.get(f'{main_app_url}/users').json()
    assert result
    assert result == {"count_users": 5}

Pay attention to the initdb fixture: we don't use it inside the test body, but we still need to request it as an argument, because this test needs data to exist in the DB.

Our initdb fixture has scope="module", which means it will be executed only once per module. If you want to recreate the DB for each test, change it to "function".

Awesome. That’s it.

In my case the next step was to add more modules and more tests for each example.

The source code with the test samples can be found here: https://github.com/xnuinside/gino-admin/tree/master/tests/integration_tests

In your case maybe you need scope="session" if you care about test speed and your code doesn't have side effects, or maybe the opposite, scope="function"; check the Pytest docs for more details about fixture scopes.

Such integration tests are very helpful when you have 4–5 services that call each other and you need to be sure that they work together correctly.

I hope this will be useful for someone. If you see any errors, feel free to comment and I will try to fix them fast.

Originally published at http://www.xnu-im.space.
