In order to use the provided development environment, install Docker Desktop for Mac or Windows. On Linux, make sure you have the latest version of Compose.
This project has a docker-compose.yml file, which will start the Django application on your local machine.
Clone the repository and start the development environment in the created directory:
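For example (the repository URL below is an assumption; adjust it if your clone lives elsewhere):
git clone https://github.com/FixMyBerlin/fixmy.platform.git
cd fixmy.platform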
docker compose up -d
(Should you receive errors like docker/transport/unixconn.py, try starting Docker Desktop first, then run the compose command.)
Now, the API is available at http://localhost:8000/api/.
curl http://localhost:8000/api/
# => {"users":"http://localhost:8000/api/users/"}
Access the console of the Docker container running the backend app:
docker compose exec app bash
python manage.py createsuperuser
TODO
Please use black as the code formatter: https://black.readthedocs.io/en/stable/
We use # fmt: off whenever we want to specify our own formatting style.
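For example, black will leave a manually aligned block untouched when it is wrapped like this (illustrative snippet, not taken from the codebase):
# fmt: off
CUSTOM_WIDTHS = {
    "narrow":     1,
    "extra_wide": 4,
}
# fmt: on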
VSCode hints:
- Use "Format Document" to manually format
- Install the Microsoft Python Extension for VSCode
- Configure "Python > Formatting Provider": "Black" (
"python.formatting.provider": "black"
)
- Configure "Python > Formatting Provider": "Black" (
- Make sure your VSCode Python Version in "Python: Select interpreter" sie same as your docker terminal
root@123406906c90:/code# which python
Using the Docker extension is recommended for developing in VSCode. It requires special ./.devcontainer settings files, which a fellow developer can provide.
Configuration options are set through environment variables, all of which are listed in docker-compose.yml.
- Production Backend: https://api.fixmyberlin.de/admin/reports/
- Production Frontend: https://fixmyberlin.de/meldungen/radbuegel/friedrichshain-kreuzberg/landing, https://radbuegel-aachen.de/meldungen
- Production Backend: https://api.fixmyberlin.de/admin/fixmyapp/gastrosignup/
- Part of this is /permits (https://api.fixmyberlin.de/admin/permits/), which are events that temporarily take up parking space.
- Production Frontend: https://fixmyberlin.de/friedrichshain-kreuzberg/terrassen
- Production Backend: https://api.fixmyberlin.de/admin/fixmyapp/
- /project ("Planungen") are part of the Planungskarte. The segments are created manually to reflect a "Planung".
  - This also uses the /question (FAQ for the "Planungen" panel).
  - Images are static in S3 and updated via an Excel import (filename in a column); might be changed to a Django-upload-based approach.
- /section and /sectiondetails are what make up the HBI. The segments are extracted from official street data.
- /sectionaccidents and /profile are part of the HBI to create custom HBI versions; not used anymore.
- Production Frontend: https://fixmyberlin.de/planungen, https://fixmyberlin.de/zustand
- EVENT_RECIPIENT (string): An email address to which email notifications for accepted or rejected applications are sent, to be forwarded to the actual recipients.
- EVENT_SIGNUPS_OPEN (string): An ISO 8601 formatted datetime which defines the beginning of the event permit application signup timeframe. If undefined, applications are always open. If the value cannot be parsed, signups are always closed.
- EVENT_SIGNUPS_CLOSE (string): Equivalent for the closing date and time. (See the sketch below for how the two timestamps interact.)
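A minimal sketch of how these variables could be interpreted, based only on the behaviour described above (illustrative, not the project's actual implementation; it assumes the raw values are read directly from the environment):
import os
from datetime import datetime, timezone

def signups_open(now=None):
    # Illustrative sketch of the documented EVENT_SIGNUPS_OPEN/CLOSE behaviour.
    now = now or datetime.now(timezone.utc)
    raw_open = os.getenv("EVENT_SIGNUPS_OPEN")
    raw_close = os.getenv("EVENT_SIGNUPS_CLOSE")
    if not raw_open and not raw_close:
        return True  # undefined: applications are always open
    try:
        opens = datetime.fromisoformat(raw_open) if raw_open else None
        closes = datetime.fromisoformat(raw_close) if raw_close else None
        if opens is not None and now < opens:
            return False
        if closes is not None and now > closes:
            return False
        return True
    except (TypeError, ValueError):
        return False  # unparseable (or incomparable) value: signups stay closed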
- Production Backend: https://api.fixmyberlin.de/admin/fixmyapp/playstreetsignup/
- Production Frontend: https://fixmyberlin.de/friedrichshain-kreuzberg/spielstrassen
Note that there is also spielstraßen.de, which might be using the static GeoJSON extract of this data from github.com/FixMyBerlin/fixmy.spielstrassen.
- Production Backend: https://api.fixmyberlin.de/admin/fahrradparken/
- Production Frontend: https://www.radparken.info
- Production Backend: https://api.fixmyberlin.de/admin/survey/
- Production Frontend:
- Survey: N/A
- Report: https://fixmyberlin.de/research/subjektive-sicherheit
- User, Groups
TODO
TODO
TODO
TODO
- Get an overview of all available commands:
docker compose exec app bash
python manage.py
- Run the test suite:
docker compose exec app bash
python manage.py test
- Run the test suite with pdb for debugging:
./manage.py test --pdb
Using debugpy:
You can enable interactive debugging through debugpy by setting the environment variable DEBUGPY=1 (e.g. through a .env file).
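For example, a .env file placed next to docker-compose.yml could contain just:
DEBUGPY=1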
Using pdb:
- Docs https://docs.python.org/3/library/pdb.html
- import pdb; pdb.set_trace() – Break into the debugger from a running program.
- l – List source code for the current file.
- a – Print the argument list of the current function.
- p some_variable – Evaluate the expression in the current context and print its value.
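A hypothetical example of where such a breakpoint could be placed (the view name is made up for illustration):
def report_detail(request, pk):  # hypothetical view, illustration only
    import pdb; pdb.set_trace()  # execution pauses here; then use l, a, p <variable>
    ...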
Removes personal information from the database. By default it preserves data of staff.
python manage.py anonymizedata
Downloads the contents of the S3 bucket to /tmp, filtered by path prefix. This command requires the environment variables AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_S3_REGION_NAME and AWS_STORAGE_BUCKET_NAME to be set correctly.
python manage.py downloaddata Data/
Exports projects as GeoJSON intended for Mapbox.
python manage.py exportprojects > /tmp/projects.json
Exports sections as GeoJSON intended for Mapbox.
python manage.py exportsections > /tmp/sections.json
Similar to exportreports, this command exports play street signups to a CSV file.
python manage.py exportplaystreets spielstrassen.csv
This command exports GastroSignup entries, either in CSV or GeoJSON format. The GeoJSON format has an additional switch that allows exporting usage areas instead of the geolocation of the entry's shop.
The CSV format doesn't include usage areas or uploaded certificates.
The GeoJSON format doesn't include any personal information.
In order to export the requested usage areas in GeoJSON format into a file gastrosignup_area.geojson:
python manage.py exportgastrosignups --format geojson --area gastrosignup_area.geojson
Bootstraps the database for HBI: downloads and imports road and intersection data, section details and section accidents, and also applies migrations.
python manage.py updatehbi
Imports sections from a shape file. The file is usually downloaded from S3 with downloaddata.
python manage.py importsections /tmp/road_sections-v1.1.shp
python manage.py importsections /tmp/intersections-v1.1.shp
Imports section details including pictures and traffic information from a CSV file. The file is usually downloaded from S3 with downloaddata.
python manage.py importsectiondetails /tmp/section_details-v1.1.csv
Imports the section accident data set, which references previously imported sections.
python manage.py importsectionaccidents /tmp/section_accidents-v1.1.csv
Updates project geometries from a shape file. The file is usually downloaded from S3 with downloaddata.
python manage.py updateprojectgeometries /tmp/projects.shp linestring
Uploads the GeoJSON export of projects or sections (see exportprojects and exportsections) to Mapbox. This command requires the environment variables MAPBOX_UPLOAD_NAME_SECTIONS, MAPBOX_UPLOAD_TILESET_SECTIONS, MAPBOX_UPLOAD_NAME_PROJECTS and MAPBOX_UPLOAD_TILESET_PROJECTS to be set correctly.
python manage.py uploadtileset --dataset projects /tmp/projects.json
Exports reports about bike stands in either CSV or GeoJSON format.
python manage.py exportreports --format csv /tmp/reports.csv
Notes:
- Likes are exported as an aggregate count
- Report creation date is exported as an ISO 8601 formatted datetime string
Imports or updates entries in the reports app from a CSV file.
The imported CSV file is required to have the following columns (see the example below):
- id: entry id that should be updated by this row. Leave empty to create new entries.
- origin_ids: semicolon-separated list of reports that are implemented by this planning
- status: one of the report statuses, as defined in reports.models.report
- address: address including post code
- lat
- long
- description: a description of the planning
- status_reason: a reason if the planning's status is 'invalid'
- number: number of bike stands built in this planning
python manage.py importreports reports.csv
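An illustrative reports.csv could look like this (all values are made-up placeholders; use a real status value from reports.models.report):
id,origin_ids,status,address,lat,long,description,status_reason,number
,101;102,<status>,"Musterstraße 1, 10999 Berlin",52.499,13.424,"Two new bike stands at the corner",,2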
Batch-sends notifications, which are enqueued when a report's status is changed.
python manage.py sendnotifications
This command requires the environment variables REPORTS_NOTIFICATION_CAMPAIGN and REPORTS_NOTIFICATION_SENDER to be set. You can send sample emails containing all variations of the text templates using
python manage.py sendnotifications --send-samples your@email.org
Creates a backup using a given Heroku app, downloads it and uses it to overwrite the local database schema and contents. User data is anonymized in the process, except for staff users, who are preserved.
TODO: Requires Heroku tools? Only for Heroku-based installations.
./scripts/fetch_database.sh fixmyplatform