Writing automated tests is an essential part of the development process, but it is not unusual to find codebases with little or no test coverage. Compared to unit tests, integration tests bring even more challenges, such as getting the same results when running on different OSs (different developer machines and CI pipelines).

In this article, we consider integration tests a specialized type of automated test in which software functionality is combined and tested as a group. Integration tests (also called service tests) sit in the middle of the test pyramid, and their purpose is to verify that separately developed modules work together properly. As opposed to unit tests, integration tests may depend on databases and external APIs.

The .NET ecosystem offers several options for integration testing. One of them is the EF Core In-Memory provider, which is a very good option that fits many scenarios, but it does not reproduce the exact behavior of a real database. So if you are searching for an approach that works with different ORMs (Entity Framework, NHibernate, Dapper, etc.), or one that brings you other possibilities (such as spinning up an environment as code), we think the docker-compose with integration tests approach will be very useful to you.
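To illustrate the trade-off, here is a minimal sketch of the two setups, assuming a hypothetical `TimeSheetContext` DbContext and a hypothetical `UseInMemoryDatabase` flag (neither is taken from the sample repository): the in-memory provider skips the real SQL engine entirely, while the SQL Server provider exercises the same behavior as production.

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

// Hypothetical DbContext standing in for the sample's data model.
public class TimeSheetContext : DbContext
{
    public TimeSheetContext(DbContextOptions<TimeSheetContext> options) : base(options) { }
}

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration) => _configuration = configuration;

    public void ConfigureServices(IServiceCollection services)
    {
        // Hypothetical switch; requires the Microsoft.EntityFrameworkCore.InMemory
        // and Microsoft.EntityFrameworkCore.SqlServer packages.
        var useInMemory = _configuration.GetValue<bool>("UseInMemoryDatabase");

        if (useInMemory)
        {
            // In-memory provider: fast and container-free, but it is not a
            // relational engine, so constraints, transactions and SQL
            // translation behave differently from a real database.
            services.AddDbContext<TimeSheetContext>(options =>
                options.UseInMemoryDatabase("TimeSheets"));
        }
        else
        {
            // SQL Server provider: exercises the same SQL behavior the
            // application uses in production (here, a containerized server).
            services.AddDbContext<TimeSheetContext>(options =>
                options.UseSqlServer(
                    _configuration.GetConnectionString("TimeSheetContext")));
        }
    }
}
```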
We chose a TimeSheet application as the example for this article. In that application, employees of a fictional company submit their worked hours throughout the week. The API layers are separated by folders and there are only two .csproj files, one for the API and the other for the tests. The code is hosted on GitHub.
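To make the idea concrete, a hypothetical integration test could look like the sketch below. The `Startup` class name, route and payload are assumptions for illustration only; the real tests live in the test project on GitHub.

```csharp
using System.Net.Http.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

// Requires the Microsoft.AspNetCore.Mvc.Testing package; Startup is assumed to
// be the API's startup class.
public class TimeSheetApiTests : IClassFixture<WebApplicationFactory<Startup>>
{
    private readonly WebApplicationFactory<Startup> _factory;

    public TimeSheetApiTests(WebApplicationFactory<Startup> factory) => _factory = factory;

    [Fact]
    public async Task Posting_worked_hours_succeeds()
    {
        // The API is hosted in-process, but the connection string injected by
        // docker-compose makes it persist to the containerized SQL Server.
        var client = _factory.CreateClient();

        var response = await client.PostAsJsonAsync("/api/timesheets", new
        {
            employeeId = 1,
            date = "2021-03-01",
            workedHours = 8
        });

        // Hypothetical endpoint; we only assert that the round trip succeeded.
        response.EnsureSuccessStatusCode();
    }
}
```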
The `docker-compose` file starts three containers: `timesheets-api`, `sql-server-database` and `integration-tests`. The first and the third use the same image (`dotnet/sdk:5.0`): `timesheets-api` runs the API inside a container, while `integration-tests` spins up, runs the tests and shuts down. Both services map host volumes to be used inside the containers; this setting allows the containers to use the source code implemented on the host machine.
```yaml
version: "3.4"
services:
  timesheets-api:
    container_name: timesheets-api
    image: mcr.microsoft.com/dotnet/sdk:5.0
    environment:
      - ConnectionStrings__TimeSheetContext=Data Source=sql-server-database; Initial Catalog=TimeSheets; User Id=sa; Password=1AvenueCodePassword*
    volumes:
      - ./Timesheets:/api
    working_dir: /api
    depends_on:
      - sql-server-database
    command: "dotnet run"
    ports:
      - "8001:8081"

  integration-tests:
    container_name: integration-tests
    image: mcr.microsoft.com/dotnet/sdk:5.0
    environment:
      - ConnectionStrings__TimeSheetContext=Data Source=sql-server-database; Initial Catalog=TimeSheets; User Id=sa; Password=1AvenueCodePassword*
    volumes:
      - .:/src
      - ../test:/test
    working_dir: /src
    command:
      [
        "./Scripts/wait-for-it.sh",
        "sql-server-database:1433",
        "--",
        "dotnet",
        "test",
        "../test/Timesheets.Tests.csproj"
      ]
    depends_on:
      - sql-server-database
```
The `timesheets-api` container is started with a plain `dotnet run`. The `integration-tests` service has a more elaborate command: the `wait-for-it.sh` script keeps checking whether the given service port (`sql-server-database:1433`) is available, and once it gets the right response, `dotnet test` runs the tests.
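Conceptually, `wait-for-it.sh` is just a TCP polling loop. The bash script itself is not reproduced here, but a rough C# sketch of the same idea (with illustrative host, port and timeout values) looks like this:

```csharp
using System;
using System.Net.Sockets;
using System.Threading;

public static class WaitForPort
{
    public static bool Wait(string host, int port, TimeSpan timeout)
    {
        var deadline = DateTime.UtcNow + timeout;
        while (DateTime.UtcNow < deadline)
        {
            try
            {
                // If the TCP connection succeeds, the service is accepting
                // connections and the tests can start.
                using var client = new TcpClient();
                client.Connect(host, port);
                return true;
            }
            catch (SocketException)
            {
                // Not ready yet; wait a bit and try again.
                Thread.Sleep(1000);
            }
        }
        return false;
    }
}

// Usage: WaitForPort.Wait("sql-server-database", 1433, TimeSpan.FromSeconds(60));
```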
The `sql-server-database` container, as the name says, is the containerized database, used both when running the API against a database (environment as code) and as the test database. The image is SQL Server (`mssql/server`). We also need to configure the database service in the compose file to expose port `1433` and bind it to the same port on the host.
```yaml
sql-server-database:
  container_name: sql-server-database
  image: mcr.microsoft.com/mssql/server
  environment:
    SA_PASSWORD: 1AvenueCodePassword*
    ACCEPT_EULA: "Y"
  ports:
    - "1433:1433"
```
The database connection string is defined in `appsettings.json`:
```json
{
  "ConnectionStrings": {
    "TimeSheetContext": "Data Source=localhost; Initial Catalog=TimeSheets; User Id=sa; Password=1AvenueCodePassword*"
  },
  ...
}
```
As you can see, the application points to a localhost database when running on the host. It is important to override the connection string so it works on the containers' network, which is done in the `environment` section of `integration-tests` and `timesheets-api`. We chose an approach that uses a single `appsettings.json` plus environment overrides. That approach helps developers avoid spreading `appsettings.json` files throughout the code (which we consider a code smell), and even if a developer chooses to use different `.env` files in the docker-compose `environment` step, they will be centralized in the same place in the folder hierarchy.
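The override works because of configuration precedence in .NET: environment variables are loaded after `appsettings.json`, and the `__` separator maps to configuration sections. The following minimal sketch (assuming the default configuration sources used by the standard host builder) shows the mechanism:

```csharp
using System;
using Microsoft.Extensions.Configuration;

class Program
{
    static void Main()
    {
        var configuration = new ConfigurationBuilder()
            .AddJsonFile("appsettings.json", optional: true) // localhost value
            .AddEnvironmentVariables()                       // container override wins
            .Build();

        // "__" in the variable name maps to the ":" section separator, so
        // ConnectionStrings__TimeSheetContext == ConnectionStrings:TimeSheetContext.
        Console.WriteLine(configuration.GetConnectionString("TimeSheetContext"));
    }
}
```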
To run the project locally, if you want to debug in your favorite IDE, for example, you only need to bring up SQL Server: `docker-compose up -d sql-server-database`. If you prefer an environment-as-code approach instead, run `docker-compose up -d timesheets-api`; that command starts the database (SQL Server) and the API on Docker. Finally, if you want to run the integration tests, run `docker-compose up integration-tests`; that command brings up the .NET 5.0 and SQL Server images, runs the tests and takes everything down at the end.
We used GitHub Actions to run the tests whenever a PR is opened against, or a commit is pushed to, the `master` branch. To set up the workflow, create a new file named `integration-tests.yml` in the `.github/workflows` folder:
```yaml
name: Tests

# Run this workflow every time a new commit is pushed to your master branch or a PR is created
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Execute tests
        working-directory: ./src
        run: docker-compose up --exit-code-from integration-tests integration-tests
```
This action starts the `integration-tests` container with all of its dependencies. The `--exit-code-from` argument uses the exit code of the selected service container as the result of the action.
When the file is committed to the GitHub repository, the action becomes available and is triggered on every commit to the master branch or whenever a PR to the master branch is created.
To see the workflow results, click the Actions tab. It is also possible to expand the logs to see the details; if a test fails, we can take a look at the results to troubleshoot.
Integration tests with `docker-compose` are a very nice replacement for in-memory databases, and this article showed a quick and easy way to configure a production-ready integration test approach.
In summary, this tutorial demonstrated how to run integration tests using .NET 5, containers with `docker-compose` and GitHub Actions for Continuous Integration. It is helpful for testing the application all the way from a request to database persistence, and the approach can also be applied with technologies other than .NET and GitHub Actions.
Check out more about integration tests in ASP.NET Core here, and tell us how this approach works for you in the comments below!
*Post written in partnership with Alvaro Kramer, Rafael Miranda and Stefano Bretas