The Lunch Place Decision Service is an internal tool that helps a company's employees choose a lunch place based on daily menus uploaded by restaurants.
- Restaurants receive a unique API Key upon registration, allowing them to upload their daily menus.
- Employees authenticate using JWT tokens to vote on menus.
- Company Admin has the authority to register both Restaurants and Employees into the system.
- Company Admin can create a restaurant profile in the system, which generates an API Key for the restaurant.
- Restaurants can upload their menu for each day using their API Key for authentication.
- Company Admin can register employees, enabling them to vote on the daily menus.
- Old API Version: Allows employees to vote for one menu.
- New API Version: Enables employees to vote for their top three menus, assigning points from 1 to 3 for their top choices.
- Votes via the old API version create or update a vote record with the top points for the day.
- Votes via the new API version create or update a vote record with points for the top three choices for the day.
- Anybody can view the current day's voting results, showing the most popular menus.
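The voting flows above can be sketched in plain Python. This is an illustration only, not the actual models or API: names like `vote_old`, `vote_new`, and `results` are hypothetical, and the new API is assumed to award 3, 2, and 1 points to the first, second, and third choice.

```python
from collections import defaultdict

# One vote record per (employee, day), as described above:
# voting again on the same day updates the existing record.
votes = {}  # (employee_id, day) -> {menu_id: points}

def vote_old(employee_id, day, menu_id, top_points=3):
    # Old API: a single choice receives the top points for the day.
    votes[(employee_id, day)] = {menu_id: top_points}

def vote_new(employee_id, day, ranked_menu_ids):
    # New API: up to three ranked choices receive 3, 2, 1 points.
    votes[(employee_id, day)] = {
        menu_id: points for menu_id, points in zip(ranked_menu_ids, (3, 2, 1))
    }

def results(day):
    # Anybody can view the day's totals, most popular menu first.
    totals = defaultdict(int)
    for (_, vote_day), choices in votes.items():
        if vote_day == day:
            for menu_id, points in choices.items():
                totals[menu_id] += points
    return sorted(totals.items(), key=lambda kv: -kv[1])
```

For example, an old-API vote for `m1` plus a new-API ballot ranking `m2` then `m1` yields `m1` as the day's winner with 5 points.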
- Header Name: `X-API-VERSION`
- Purpose: Allows clients to specify the desired version of the API for each request, ensuring compatibility and access to the features available in that version.
- Format: The version value should be in the format `major.minor` (e.g., `1.0`), where `major` represents significant and potentially breaking changes, and `minor` represents backward-compatible improvements and additions.

If the `X-API-VERSION` header is missing or invalid, the default version of the API is selected.
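The fallback rule can be expressed as a small helper. This is a sketch, not the project's actual implementation; `DEFAULT_VERSION` and `resolve_api_version` are illustrative names.

```python
import re

DEFAULT_VERSION = "1.0"  # placeholder; the project's configured default may differ

_VERSION_RE = re.compile(r"^\d+\.\d+$")  # major.minor

def resolve_api_version(headers):
    """Return the requested API version, falling back to the default
    when X-API-VERSION is missing or not in major.minor format."""
    value = headers.get("X-API-VERSION")
    if value and _VERSION_RE.match(value):
        return value
    return DEFAULT_VERSION
```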
- Python 3.8+
- Django 3.2+
- Django REST Framework
- Clone the repository and navigate into it.
- Install dependencies: `pip install -r requirements.txt`
- Run migrations: `python manage.py migrate`
- Start the server: `python manage.py runserver`
To ensure code quality, readability, and type correctness, we employ a series of linting and typing checks. These checks are encapsulated in a script named `lint.sh`, which should be run before committing changes to the repository.
The `lint.sh` script automates the process of linting and typing checks. It runs the following tools:
- Flake8: Checks for coding style issues.
- AutoPEP8: Automatically formats Python code to conform to the PEP 8 style guide.
- isort: Sorts imports alphabetically and automatically separates them into sections.
- Mypy: Performs static type checking.
To run the script, navigate to your project's root directory in the terminal and execute:
./lint.sh
Ensure the script has executable permissions. If not, grant them with:
chmod +x lint.sh
Our project uses `pytest` for running unit tests, integration tests, and the full test suite. This ensures that new changes do not break existing functionality and that the application behaves as expected.
- Unit Tests: Focus on testing individual components in isolation.
pytest -m "unit"
- Integration Tests: Test the integration between different components or systems.
pytest -m "integration"
- All Tests: To run the entire test suite without filtering.
pytest
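The `-m` filters above select tests by marker. A minimal sketch of how such a test might be marked (the function under test is hypothetical, and the `unit` marker is assumed to be registered in the project's pytest configuration):

```python
import pytest

def winning_menu(totals):
    """Pick the menu id with the most points (tie-breaking left unspecified)."""
    return max(totals, key=totals.get)

@pytest.mark.unit  # selected by: pytest -m "unit"
def test_winning_menu():
    assert winning_menu({"menu_a": 5, "menu_b": 3}) == "menu_a"
```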
- Run Frequently: Execute `lint.sh` and the relevant `pytest` commands frequently during development. This helps catch and fix issues early, improving the overall quality of the codebase.
- Pre-commit Hooks: Consider setting up pre-commit hooks that automatically run `lint.sh` and `pytest -m "unit"` to ensure that commits meet the quality standards.
- Continuous Integration (CI): Integrate these tools into your CI pipeline to automatically check every push or pull request. This ensures that all contributions are vetted for quality before merging.
- Generate the Schema
  Run the following command in your terminal to generate the initial `schema.yml` file:
  `python manage.py generateschema > schema.yml`
  This command will create a `schema.yml` file in your project root directory containing the OpenAPI schema for your project.
You might want to customize or correct the automatically generated schema to better match your API's functionality or to add additional information that DRF's schema generation might not include.
- Open the Swagger Editor
  Visit Swagger Editor in your web browser.
- Import Your Schema
  In the Swagger Editor, you can either paste the content of your `schema.yml` directly into the editor or use the "File" menu to import the `schema.yml` file.
- Edit and Validate Your Schema
  Once your schema is loaded into the editor, you can make any necessary adjustments. Swagger Editor provides real-time validation and feedback on your schema, which can help you identify and fix issues.
- Export the Updated Schema
  After you've made your changes, use the "File" > "Save as YAML" option in Swagger Editor to download the updated schema. Save this file as `schema.yml` again.
To serve the updated `schema.yml` through your Django application, you should place it in a static files directory. Assuming your project has an app named `lunch_service`:

- Move the Schema File
  Move your updated `schema.yml` file to the `lunch_service/static/` directory.
Open a web browser and go to http://localhost:8000/swagger-ui/ (adjust the port if necessary). You should see the Swagger UI loaded with your API documentation based on the `schema.yml` file.
Elevating the quality, usability, and security of your API involves incorporating advanced features that cater to the evolving needs of users and the system itself. Here are essential enhancements to consider for future development:
- Introduction of `vote_points` in the `Menu` Model
  - Objective: To streamline the process of selecting a menu for lunch by minimizing the need for aggregation queries on the Vote model. Incorporating a `vote_points` field directly within the `Menu` model can significantly enhance response times and overall performance.
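The trade-off behind this enhancement can be sketched in plain Python (the data shape and names are hypothetical, standing in for the Vote and Menu models):

```python
from collections import defaultdict

# Hypothetical vote records for one day: (menu_id, points) pairs.
votes = [("menu_a", 3), ("menu_b", 2), ("menu_a", 1)]

def aggregate_points(vote_records):
    """Current approach: recompute totals from every vote record on each read."""
    totals = defaultdict(int)
    for menu_id, points in vote_records:
        totals[menu_id] += points
    return dict(totals)

class Menu:
    """With a vote_points field, the total is maintained at write time,
    so reads need no aggregation query."""
    def __init__(self):
        self.vote_points = 0

    def add_vote(self, points):
        self.vote_points += points

menus = defaultdict(Menu)
for menu_id, points in votes:
    menus[menu_id].add_vote(points)
```

Both approaches yield the same totals; the denormalized field simply shifts the cost from read time to write time.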
- Using Django REST Framework API Key
  - Objective: Currently, for simplicity, API Key authentication is implemented inside the `IsRestaurant` permission class. For more complex usage, the Django REST Framework API Key package could be adopted.
- Creation of an `Item` Model Linked to `Menu`
  - Objective: To lay the groundwork for extended functionality and scalability. By establishing an `Item` model with a ForeignKey to the `Menu` model, the system can offer more detailed and structured data management. Initially, for simplicity, a TextField attribute is used to list items.
- Provide Settings Configuration per Environment (production, staging, testing)
  - Objective: To allow each deployment environment to carry its own settings (database credentials, secrets, debug flags), making configuration differences explicit and deployments safer.
- Rate Limiting and Throttling
  - Objective: These measures are crucial for safeguarding your API against overutilization and potential abuse. By capping the number of requests a user can make within a specific period (rate limiting) and controlling the request rate (throttling), you can ensure equitable access for all users while preventing service disruptions or slowdowns due to excessive traffic.
- Pagination
  - Objective: This feature enhances both the performance and user experience of your API by partitioning large datasets into smaller, more manageable segments. Pagination facilitates more efficient data retrieval and processing for clients, significantly reducing server load and streamlining data handling.
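Both throttling and pagination are available out of the box in Django REST Framework and could be enabled through settings. A sketch of such a configuration (the rates and page size here are placeholder values, not project decisions):

```python
# settings.py (sketch): DRF's built-in throttling and pagination.
REST_FRAMEWORK = {
    "DEFAULT_THROTTLE_CLASSES": [
        "rest_framework.throttling.AnonRateThrottle",
        "rest_framework.throttling.UserRateThrottle",
    ],
    "DEFAULT_THROTTLE_RATES": {
        "anon": "60/min",   # unauthenticated viewers of voting results
        "user": "120/min",  # authenticated employees
    },
    "DEFAULT_PAGINATION_CLASS": "rest_framework.pagination.PageNumberPagination",
    "PAGE_SIZE": 20,
}
```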
- Docker: Ensure you have Docker installed on your system. If not, follow the installation instructions for Docker.
- Docker Compose: Ensure Docker Compose is installed. It usually comes with Docker for Windows and Docker for Mac, but you might need to install it separately on Linux. See the Docker Compose installation guide.
- Clone the Repository
  Start by cloning the project repository to your local machine.
  `git clone [repository URL]`
  `cd [project directory]`
- Environment Configuration
  Copy the sample environment configuration file and adjust the variables according to your local environment.
  `cp .env.example .env`
  Be sure to replace values like database credentials, secret keys, and any other necessary configurations. If you are going to run using Docker, set `POSTGRES_HOST` to the database service name `db`.
- Build the Docker Containers
  Use Docker Compose to build the application and database containers.
  `docker-compose up --build`
  This command builds the images for your services (as defined in `docker-compose.yml`) and starts the containers. The first build might take some time as it needs to fetch the base images and install dependencies.
- Database Migrations
  After the containers are up and running, apply database migrations to set up the initial database schema.
  `docker-compose exec web python manage.py migrate`
- Create a Company Admin (Optional)
  To access the admin panel, you'll need to create a superuser account.
  `docker-compose exec web python manage.py create_company_admin`
- Accessing the Application
  Once everything is set up, you can access the application by navigating to http://localhost:8000 in your web browser. Adjust the port according to your `docker-compose.yml` configuration if necessary.
Here are some common commands you'll use during development:
- Starting the Containers: `docker-compose up` (add `-d` for detached mode)
- Stopping the Containers: `docker-compose down`
- Viewing Container Logs: `docker-compose logs` (add `-f` to follow the log output)
- Running Management Commands: `docker-compose exec web python manage.py [command]`