Tests for https://www.great.gov.uk.
This repository contains:
- Test suites:
- directory_tests_shared - a sub-project with shared test code
- Additional non-functional tests & tasks:
- Content diff test - discover content differences between the same page hosted on two different environments
- Dead link checker - identify broken and invalid links
- Geckoboard updater - update Geckoboards with test results fetched from CircleCI and Jira board stats
- CMS page checker and reporter - two simple CMS related tests/tasks
- test_cms_pages_return_200.py - check if all CMS pages can be visited without any error
- generate_page_status_report.py - generate a simple CMS page status report
You'll need:
- Python 3.8+
- pip to install required dependencies
- virtualenvwrapper to create a virtual Python environment
The following instructions apply to all test suites and tasks included in this repository.
```bash
git clone https://github.com/uktrade/directory-tests
cd directory-tests
mkvirtualenv -p python3.8 {browser|functional|load|periodic_tasks|smoke|tests_shared}
make requirements_{browser|functional|load|periodic_tasks|smoke|tests_shared}
```
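For example, to prepare an environment for just the smoke suite (a concrete instantiation of the placeholders above; the virtualenv name is only a suggestion):

```bash
# Set up a dedicated virtualenv and install the pinned dependencies for the smoke suite
mkvirtualenv -p python3.8 smoke
make requirements_smoke
```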
For all additional configuration instructions please check the dedicated README.
There are separate sets of requirements files for every test suite and also for the periodic tasks & directory_tests_shared.
We use pip-compile (part of the pip-tools package) to compile pinned requirements_*.txt files from unpinned requirements_*.in files.
List of unpinned requirements files for all test suites, tasks and sub-projects:
- directory_tests_shared/requirements.in
- requirements_browser.in
- requirements_functional.in
- requirements_load.in
- requirements_smoke.in
- requirements_periodic_tasks.in - a common set of requirements for: Geckoboard updater, Content diff test, Dead link checker & CMS page checker and reporter
To add, remove or change any dependency:
- first edit the appropriate requirements_*.in file - do not use a pinned version of any dependency (unless the latest version of a specific dependency introduced a breaking change)
- then recompile the requirements file:
  make compile_requirements_{browser|functional|load|periodic_tasks|smoke|test_tools|tests_shared}
To recompile all requirement files at once use:
make compile_all_requirements
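As an illustration, a typical dependency bump for the functional suite could look like the sketch below (the package name is only an example and not necessarily a dependency of this repository):

```bash
# 1. Add or adjust an unpinned entry in the relevant .in file
echo "requests" >> requirements_functional.in

# 2. Recompile the pinned requirements_functional.txt via pip-compile (wrapped by make)
make compile_requirements_functional

# 3. Reinstall the freshly pinned dependencies into the active virtualenv
make requirements_functional
```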
To easily switch between env var configurations for different test environments (DEV, Staging, UAT & periodic tasks), please use the dedicated convenience shell scripts. You can find those scripts in Rattic (look for DIT test env vars).
Once you get hold of them you'll be able to run any test suite or task locally against a specific test environment.
To make switching between configurations easier and faster, you can create handy shell aliases, like:
```bash
alias dev='source ~/dir-dev.sh';
alias stage='source ~/dir-stage.sh';
alias uat='source ~/dir-uat.sh';
alias geckoboard='source ~/geckoboard.sh';
alias periodic='source ~/periodic.sh';
```
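With those aliases in place, switching configurations before running a task is a one-liner, e.g. (make dead_links_check is described further below; this assumes the sourced script exports everything that target needs):

```bash
# Load the DEV configuration into the current shell, then run the dead link checker
dev
make dead_links_check
```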
All required and optional Service URLs and Secrets such as API keys are specified in env.json.
In order to run any test suite or task from this repository, locally or remotely, you'll need all of those env vars exported to the shell.
The reason why env vars are specified in the aforementioned env.json file is that initially the tests were executed locally in Docker containers and later on also in CircleCI.
Setting env vars for a Docker container via env_file requires a simple key:val file.
CircleCI, on the other hand, allows you to either use a shell script that uses the export command to set env vars, or define env vars in the project settings.
The latter solution is preferred but it's not ideal because there's no easy way to have a single project in CircleCI with different sets of secrets for different test environments.
Moreover, exporting env vars locally also requires a shell script that uses the export command to do the same.
In order to address all of those requirements and have only one place where we specify all mandatory (and optional) env vars (for all test environments), we've decided to use env vars prefixed with the name of the test environment, e.g. DEV_, STAGE_, UAT_ & PROD_.
We've also written a convenience script, env_vars/env_writer.py, which:
- checks if required env vars are set
- saves env vars without the name prefix in files that can be used locally, in a Docker container or on CI to export env vars.
This script takes two arguments:
- --env - look for environment variables prefixed with "ENV_", where ENV can be e.g.: DEV, STAGE, UAT or PROD
- --config - specify the input config file [defaults to: ./env_vars/env.json] with a list of required and optional env vars and the output file_path.
Let's explain how this script works in a quasi-BDD scenario:

Given all required env vars are specified in the env.json file, e.g.: "CMS_API_KEY" & "CMS_API_URL"
When env_writer.py is executed with --env=DEV and --config=env.json
Then it will check if both "DEV_CMS_API_KEY" & "DEV_CMS_API_URL" env vars are set
And it will generate two env files (only if both env vars are set):
1) .env_with_export - a shell script which sets env vars with the export command (can be used with the source or . shell commands)
2) .env_without_export - a simple key:val file (can be used by docker or docker-compose)
Or it will raise an exception if any required env var is missing
Using env vars with test environment prefixes allowed us to keep multiple sets of secrets in one CircleCI project.
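A minimal sketch of the env_writer.py flow described above, assuming the DEV prefix and the two example env vars from the scenario (the key and URL values are placeholders, and the script is invoked directly with python3 here purely for illustration):

```bash
# Prerequisite: the prefixed env vars are already exported in the current shell
export DEV_CMS_API_KEY='placeholder-key'
export DEV_CMS_API_URL='https://cms.api.dev.example.com/'

# Strip the DEV_ prefix and generate both env files
python3 ./env_vars/env_writer.py --env=DEV --config=./env_vars/env.json

# .env_with_export should now contain e.g.:     export CMS_API_KEY=placeholder-key
# .env_without_export should now contain e.g.:  CMS_API_KEY=placeholder-key
source .env_with_export   # or: . .env_with_export
```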
All code in this repository is formatted with black & isort.
It is highly advisable to use these tools to maintain a consistent code style across all test suites.
To automate the process of code formatting with all aforementioned code formatters, code checkers (like flake8) and other linters, you can use the pre-commit tool.
A ready-to-use pre-commit configuration is in .pre-commit-config.yaml.
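A typical way to enable it locally (these are standard pre-commit CLI commands, not repository-specific targets):

```bash
pip install pre-commit
pre-commit install           # registers the git hook so the checks run on every commit
pre-commit run --all-files   # runs all hooks from .pre-commit-config.yaml against the whole repo
```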
"Dead link checker" is a simple tests that use pylinkvalidator to crawl any web site and report link errors like 500 ISE or 404 Not Found.
The main makefile
includes a dedicated makefile_dead_link_checker.mk.
In that additional mk
file you'll find targets to run "Dead link checker" test against any test environment.
"Dead link checker" test is controlled with 3 env vars:
TEST_ENV
- name of the test environment, useDEV
,STAGE
,UAT
orPROD
{DEV|STAGE|UAT}_BASICAUTH_USER
- username for IP Filtering Service authentication (required on all non-Production environments){DEV|STAGE|UAT}_BASICAUTH_PASS
- password for IP Filtering Service authentication (required on all non-Production environments)
To run this test locally against the DEV environment:
- Enable the venv for periodic tasks (the workon command is part of the virtualenvwrapper package):
  workon periodic_tasks
- Export env vars with the dir-dev.sh convenience script:
  dev   # if you configured an alias, otherwise: source ~/dir-dev.sh
- Run the test:
  make dead_links_check
It is also possible to run this test without using an env var convenience script.
Simply provide all required env vars to the make target:
```bash
workon periodic_tasks
TEST_ENV=DEV DOMESTIC_URL='https://url.to.dev.environment/' DEV_BASICAUTH_USER='username' DEV_BASICAUTH_PASS='password' \
    make dead_links_check
```
You can find credentials for the IP Filtering Service:
- in the env var convenience scripts
- in Rattic
- or by asking the WebOps team
All test suites, periodic tasks & tests are executed every day (Mon through Fri) in CircleCI against the Dev, Staging & UAT environments.
Test workflows are defined in .circleci/config.yml.
Workflows were designed with the DRY rule in mind. Code duplication is kept to the necessary minimum.
Every workflow consists of one or more jobs. A job is built with multi-step blocks.
Every step should have "Single Responsibility" like installing requirements or running tests.
Thanks to this approach we've reduced code duplication and increased readability.
More on that topic can be found in the CircleCI Reusable Config Reference Guide.