This app is designed as a comprehensive Django-based application that automates and optimizes room pricing strategies by leveraging real-time data processing and a responsive dashboard interface. Below is an overview of the key components and their roles within the system:
Core:
- Django ASGI & WSGI: Configurations for asynchronous and synchronous web servers.
- Celery: Used for handling asynchronous task queues to manage long-running processes without blocking the main server.
- Routers: Defines URL route handlers that direct incoming requests to appropriate views.
- Settings: Contains all configuration settings for the Django project, including database configurations, third-party apps, and middleware settings.
- URLs & Views: Define the logic and endpoints for user interactions and API responses.
Dashboard Service:
- Models: Includes the `DashboardData` table, which stores metrics and statistics for display on the dashboard.
- Serializers: Handle the conversion of model instances to JSON for API responses.
- Tasks: Asynchronous tasks, such as the generation of dashboard metrics, are scheduled and executed.
- Admin: Django admin configurations for managing dashboard data through a graphical interface.
- Tests: Unit tests to ensure the reliability and integrity of dashboard functionalities.
Data Provider:
- Management Commands: Scripts to load external CSV data into a Redis queue, facilitating quick and efficient data handling.
- Models: Contains the `Event` table to log and track data events processed by the system.
- Tasks: Background tasks for loading events from the Redis queue into the database.
- Tests: Comprehensive tests to validate data integrity and the correct functioning of data processing workflows.
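The CSV → Redis → database flow described above can be sketched as follows. This is a minimal stand-in: an in-memory deque plays the role of the Redis list, and the column names are illustrative assumptions, not the project's actual schema (the real code uses the Redis client and the Django ORM).

```python
import csv
import io
import json
from collections import deque

# Stand-in for a Redis list; the real management command pushes to Redis.
queue: deque[str] = deque()

def load_csv_into_queue(csv_text: str) -> None:
    # Management-command role: parse the CSV and enqueue one JSON payload per row.
    for row in csv.DictReader(io.StringIO(csv_text)):
        queue.append(json.dumps(row))

def drain_queue_into_db(db: list) -> None:
    # Background-task role: pop events off the queue and persist them.
    while queue:
        db.append(json.loads(queue.popleft()))

sample = "hotel_id,night_of_stay\n1,2019-01-01\n1,2019-01-02\n"
db: list = []
load_csv_into_queue(sample)
drain_queue_into_db(db)
print(db)  # two event dicts, in CSV order
```

Decoupling the CSV parse from the database write through a queue is what lets the management command return quickly while a worker persists events in the background.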
- Docker: Utilizes Docker containers to encapsulate the application environment, ensuring consistency across different development and production setups.
- Gunicorn: Serves as the WSGI HTTP server that runs the Python web application inside a Docker container, particularly suited for handling concurrent requests.
- Poetry: Manages dependencies and packages, providing a reproducible environment and simplifying package management and deployment.
- Environment Setup: Utilize the `.env` file to configure necessary environment variables such as database URLs, API keys, and other sensitive configurations.
- Docker Compose: Simplifies the deployment of multi-container Docker applications, allowing each component to be launched with predefined settings.
- Static: Serves static files for the root API frontend.
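For reference, a `.env` along these lines is typical for this stack. The variable names below are illustrative assumptions; check `app/settings.py` for the names the project actually reads:

```
DEBUG=True
SECRET_KEY=change-me
REDIS_URL=redis://redis:6379/0
DATABASE_URL=sqlite:///db.sqlite3
```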
```
App/
│
├── .env                  # Environment variables for the project
├── gunicorn.conf.py      # Gunicorn configuration for Django
├── manage.py             # Django's command-line utility for administrative tasks
├── docker-compose.yml    # Docker configuration file
├── LICENSE               # Project license
├── poetry.lock           # Poetry package versions lockfile
├── pyproject.toml        # Poetry configuration file
├── pytest.ini            # Pytest configuration file
├── README.md             # The file you are reading
│
├── app/                  # Main Django project directory
│   ├── __init__.py
│   ├── asgi.py
│   ├── celery.py
│   ├── routers.py
│   ├── settings.py
│   ├── urls.py
│   ├── views.py
│   └── wsgi.py
│
├── dashboard_service/    # Dashboard service module
│   ├── migrations/       # dashboard_service db migrations
│   ├── tests/            # Dashboard service unit tests
│   ├── admin.py
│   ├── apps.py
│   ├── models.py
│   ├── serializers.py
│   ├── tasks.py          # Dashboard service generation task
│   ├── urls.py
│   └── views.py
│
├── data_provider/        # Data provider module
│   ├── management/       # Data provider commands
│   ├── migrations/       # data_provider db migrations
│   ├── tests/            # Data provider unit tests
│   ├── admin.py
│   ├── apps.py
│   ├── models.py
│   ├── serializers.py
│   ├── tasks.py          # Data provider event loading tasks
│   ├── urls.py
│   └── views.py
│
├── static/               # Static files for the web application
└── templates/            # Django templates directory
```
- Django: The primary framework for building the web application, handling the API endpoints, the ORM for database interactions, and the admin interface.
- Django REST Framework: Provides powerful and flexible tools for building RESTful APIs.
- Celery: An asynchronous task queue used to manage periodic tasks for data simulation and dashboard updates.
- Redis: Used as the message broker for Celery to handle task queues.
- PostgreSQL/SQLite: PostgreSQL is recommended for production, while SQLite is used for development and testing to simplify setup. IMPORTANT: Because development uses SQLite, database locks may occur if the Celery worker concurrency is increased, so it is kept at 1.
- Docker: Containerizes the application, ensuring consistent environments across different deployment stages.
- drf-yasg: Automatically generates Swagger/OpenAPI documentation for the APIs.
- Poetry: Dependency management and packaging tool for Python projects.
- Pytest: Testing framework to ensure the code quality and functionality of the application.
- Requests-mock: Library for mocking HTTP requests in tests.
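Since migrations are run separately against a `data_provider` and a `dashboard_service` database (see the setup steps below), the settings plausibly define multiple database aliases plus a database router. A minimal sketch follows; the alias names are taken from the migrate commands, while the engine, file names, and router class are illustrative assumptions:

```python
# Plain-Python sketch of the relevant settings.py pieces.
DATABASES = {
    "default": {"ENGINE": "django.db.backends.sqlite3", "NAME": "db.sqlite3"},
    "data_provider": {"ENGINE": "django.db.backends.sqlite3", "NAME": "data_provider.sqlite3"},
    "dashboard_service": {"ENGINE": "django.db.backends.sqlite3", "NAME": "dashboard_service.sqlite3"},
}

class AppDatabaseRouter:
    """Route each app's models and migrations to its own database alias."""

    app_to_db = {
        "data_provider": "data_provider",
        "dashboard_service": "dashboard_service",
    }

    def db_for_read(self, model, **hints):
        return self.app_to_db.get(model._meta.app_label)

    def db_for_write(self, model, **hints):
        return self.app_to_db.get(model._meta.app_label)

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.app_to_db:
            return db == self.app_to_db[app_label]
        return None  # no opinion for other apps
```

A router like this is what makes `manage.py migrate --database=data_provider` apply only that app's migrations to that database.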
The `depends_on` directive in Docker Compose ensures that the `celery` and `celery-beat` services start after the `redis` service, but it doesn't guarantee that the `web` service will be fully ready before starting `celery` and `celery-beat`.
To ensure `celery` and `celery-beat` start only after the `web` service is fully ready (i.e., after migrations are applied), we won't start them immediately with `docker-compose up`. Instead, we'll start them manually after setting up the databases.
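The relevant slice of such a `docker-compose.yml` might look like this (the service names match the commands below; the images and build settings are illustrative, and the file in the repo may differ):

```yaml
services:
  redis:
    image: redis:7
  web:
    build: .
    depends_on:
      - redis
  celery:
    build: .
    depends_on:
      - redis   # started after redis, but not after web has finished migrating
  celery-beat:
    build: .
    depends_on:
      - redis
```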
Here's a step-by-step guide, including test API calls for each endpoint:
First, clone the GitHub repository to your local machine:
git clone git@github.com:arkeodev/django-price-manager.git
cd django-price-manager
Build the Docker containers:
docker-compose build
Start the Redis and Web service containers (but not the Celery containers yet):
docker-compose up -d redis web
Run migrations for the `data_provider` database:
docker-compose run web bash -c "poetry run python manage.py migrate --database=data_provider"
Run migrations for the `dashboard_service` database:
docker-compose run web bash -c "poetry run python manage.py migrate --database=dashboard_service"
Run the tests using `pytest`:
docker-compose run web bash -c "poetry run pytest"
Test the Event API to verify that the service is working correctly. Note that the call below uses the year 2019: the incoming data in the data.csv file is dated after this year, which allows the tasks to run.
curl -X POST http://localhost:8000/events/ -H "Content-Type: application/json" -d '{"hotel_id": 1, "event_timestamp": "2019-01-01T00:00:00Z", "status": 1, "room_reservation_id": "0013e338-0158-4d5c-8698-aebe00cba360", "night_of_stay": "2019-01-01"}'
curl -X GET http://localhost:8000/events/?hotel_id=1
Once the databases are set up, start the Celery workers in the order below, followed by the beat service.
Start the Celery event processing task worker:
docker-compose up -d celery
Start the Celery dashboard generation task worker:
docker-compose up -d celery-update
This command pushes all the data into a Redis queue:
docker-compose run web bash -c "poetry run python manage.py trigger_load_events"
Start the Celery beat service:
docker-compose up -d celery-beat
Access the application root page from a browser (http://localhost:8000/).
Access the Swagger documentation page from a browser.
In Swagger you can test the Dashboard API.
First, clone the GitHub repository to your local machine:
git clone git@github.com:arkeodev/django-price-manager.git
cd django-price-manager/django_price_manager
Run the following commands to prepare your databases:
poetry run python manage.py migrate --database=data_provider
poetry run python manage.py migrate --database=dashboard_service
You'll need four different terminals for this process:
- Terminal 1: Running the app with Gunicorn. This command starts the Gunicorn server:
poetry run gunicorn django_price_manager.wsgi:application --bind 0.0.0.0:8000
- Terminal 2: Running the Celery event processing task worker. This command starts the Celery event processing worker:
poetry run celery -A django_price_manager worker -l info -c 3
- Terminal 3: Running the Celery dashboard generation task worker. This command starts the Celery dashboard generation worker:
poetry run celery -A django_price_manager worker -l info -Q dashboard_queue -c 1
- Terminal 4: Running Celery beat. The first command pushes all the data into a Redis queue:
poetry run python manage.py trigger_load_events
The second command starts the periodic tasks:
poetry run celery -A django_price_manager beat -l info
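Under the hood, `celery beat` reads a schedule from the Django settings. The entry below is a sketch of what that might look like; the task path and interval are illustrative assumptions (see `dashboard_service/tasks.py` for the real task), while the `dashboard_queue` name matches the `-Q` flag used for the worker above:

```python
# Plain-Python sketch of a CELERY_BEAT_SCHEDULE settings entry.
CELERY_BEAT_SCHEDULE = {
    "generate-dashboard-metrics": {
        # Hypothetical task path, for illustration only.
        "task": "dashboard_service.tasks.generate_dashboard_metrics",
        "schedule": 300.0,  # seconds between runs
        "options": {"queue": "dashboard_queue"},  # routes to the -Q worker above
    },
}
print(sorted(CELERY_BEAT_SCHEDULE))  # ['generate-dashboard-metrics']
```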
- Run All Tests:
poetry run pytest
- Test Structure:
  - `data_provider/tests/test_models.py`: Tests for Data Provider models.
  - `data_provider/tests/test_serializers.py`: Tests for Data Provider serializers.
  - `data_provider/tests/test_views.py`: Tests for Data Provider views.
  - `data_provider/tests/test_tasks.py`: Tests for Data Provider periodic tasks with mocking.
  - `dashboard_service/tests/test_models.py`: Tests for Dashboard Service models.
  - `dashboard_service/tests/test_serializers.py`: Tests for Dashboard Service serializers.
  - `dashboard_service/tests/test_views.py`: Tests for Dashboard Service views.
  - `dashboard_service/tests/test_tasks.py`: Tests for Dashboard periodic tasks with mocking.
Mocking is used in tests to simulate external service interactions, ensuring tests run in isolation and are not dependent on external systems. We use the `unittest.mock` module to mock HTTP requests in the tests for the Dashboard Service.