Add prettier pre-commit hook #608

Merged 5 commits on Feb 18, 2023

2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -107,7 +107,7 @@ jobs:
echo "adding micromamba to path: $MAMBA_EXE"

pip install --no-deps git+https://git@github.com/regro/libcflib@master

for f in ./plugins/quetz_*
do
echo "::group::Testing plugin ${f}"
2 changes: 1 addition & 1 deletion .github/workflows/docker-build-push.yml
@@ -5,7 +5,7 @@ on:
branches:
- main
tags:
- '**'
- "**"

jobs:
docker:
4 changes: 3 additions & 1 deletion .github/workflows/lint.yml
@@ -7,7 +7,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: mamba-org/provision-with-micromamba@v15
- uses: pre-commit/action@v2.0.0
with:
extra_args: --all-files --show-diff-on-failure
env:
PRE_COMMIT_USE_MICROMAMBA: 1
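The same checks can be reproduced locally before pushing. A minimal sketch, assuming `pre-commit` and `micromamba` are already installed (the environment variable mirrors the CI setting above):

```bash
# Run all configured hooks against the whole repository, with the same
# flags the lint workflow uses, and show the diff prettier would apply.
export PRE_COMMIT_USE_MICROMAMBA=1
pre-commit run --all-files --show-diff-on-failure
```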
5 changes: 2 additions & 3 deletions .github/workflows/sphinx.yml
@@ -3,10 +3,10 @@ name: "sphinx docs check"
on:
push:
branches:
- main
- main
pull_request:
branches:
- main
- main

jobs:
docs:
@@ -27,4 +27,3 @@ jobs:
with:
name: DocumentationHTML
path: docs/build/html

5 changes: 4 additions & 1 deletion .gitignore
@@ -18,4 +18,7 @@ test_quetz

# IDE
.idea/
.vscode/
.vscode/

.env
.envrc
4 changes: 4 additions & 0 deletions .pre-commit-config.yaml
@@ -32,3 +32,7 @@ repos:
- types-toml
- types-ujson
args: [--show-error-codes]
- repo: https://github.com/Quantco/pre-commit-mirrors-prettier
rev: 2.7.1
hooks:
- id: prettier-conda
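With the hook registered, prettier can also be run on its own by its hook id. A short sketch (the id is taken from the config entry above):

```bash
# Install the git pre-commit hook once, then run only the new prettier
# hook across all tracked files.
pre-commit install
pre-commit run prettier-conda --all-files
```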
10 changes: 5 additions & 5 deletions CONTRIBUTING.md
@@ -26,8 +26,8 @@ formatting checks prior to making a commit.

Make sure to include the following with your pull requests:

* description,
* links to relevant issues,
* any necessary code changes,
* unit tests for these changes,
* changes to relevant documentation sections
- description,
- links to relevant issues,
- any necessary code changes,
- unit tests for these changes,
- changes to relevant documentation sections
30 changes: 15 additions & 15 deletions README.md
@@ -63,9 +63,9 @@ quetz run test_quetz --copy-conf ./dev_config.toml --dev --reload

Links:

* <http://localhost:8000/> - Login with your github account
* <http://localhost:8000/api/dummylogin/alice> - Login with test user, one of [alice | bob | carol | dave]
* <http://localhost:8000/docs> - Swagger UI for this REST service
- <http://localhost:8000/> - Login with your github account
- <http://localhost:8000/api/dummylogin/alice> - Login with test user, one of [alice | bob | carol | dave]
- <http://localhost:8000/docs> - Swagger UI for this REST service

Download `xtensor` as a test package:

@@ -191,7 +191,7 @@ Finally, you can create and run a new quetz deployment based on this configuration file:
quetz run postgres_quetz --copy-conf config_postgres.toml
```

Note that this recipe will create an ephemeral PostgreSQL database and it will delete all data after the `some-postgres` container is stopped and removed. To make the data persistent, please check the documentation of the `postgres` [image](https://hub.docker.com/_/postgres/) or your container orchestration system (Kubernetes or similar).
Note that this recipe will create an ephemeral PostgreSQL database and it will delete all data after the `some-postgres` container is stopped and removed. To make the data persistent, please check the documentation of the `postgres` [image](https://hub.docker.com/_/postgres/) or your container orchestration system (Kubernetes or similar).
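As an illustrative sketch (not part of this diff), the data can be kept across container restarts by mounting a named volume into the standard `postgres` image; the volume name and password below are placeholders:

```bash
# Persistent PostgreSQL for quetz: the named volume "quetz_pg_data"
# survives removal of the "some-postgres" container.
docker run --name some-postgres \
  -e POSTGRES_PASSWORD=mysecretpassword \
  -v quetz_pg_data:/var/lib/postgresql/data \
  -d postgres
```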

### Running tests with PostgreSQL backend

@@ -258,17 +258,17 @@ In order to generate an API key the following must be true:
2. The user must be part of the target channel (you might need to create a channel first, see the previous section on how to create a channel via the swagger docs)
3. Go to the swagger docs at `<deployment url>:<port>/docs` and POST to `/api/api-keys` (a curl sketch of this request follows the list):

```json
{
"description": "my-test-token",
"roles": [
{
"role": "owner",
"channel": "my-channel"
}
]
}
```
```json
{
"description": "my-test-token",
"roles": [
{
"role": "owner",
"channel": "my-channel"
}
]
}
```

4. Then, GET on `/api/api-keys` to retrieve your token
5. Finally, set the `QUETZ_API_KEY` environment variable to this value so you can use quetz-client to interact with the server.
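As a sketch of step 3, the same request can be sent with curl. The host, port, and authentication header below are placeholders and assume you already hold a valid key or session:

```bash
# Create an owner API key for "my-channel" using the JSON body from step 3.
curl -X POST "http://localhost:8000/api/api-keys" \
  -H "Content-Type: application/json" \
  -H "X-API-Key: <an-existing-key>" \
  -d '{"description": "my-test-token", "roles": [{"role": "owner", "channel": "my-channel"}]}'
```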
2 changes: 1 addition & 1 deletion RELEASE.md
@@ -15,7 +15,7 @@ Python package. Before generating a package, we first need to install `build`.
pip install build twine
```

To create a Python source package (``.tar.gz``) and the binary package (`.whl`) in the `dist/` directory, do:
To create a Python source package (`.tar.gz`) and the binary package (`.whl`) in the `dist/` directory, do:

```bash
python -m build
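```

To publish the build, the artifacts in `dist/` are typically checked and uploaded with `twine`, which is installed above together with `build`. An illustrative sketch, not part of this diff:

```bash
# Verify the package metadata, then upload to the package index
# (requires configured credentials).
twine check dist/*
twine upload dist/*
```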
31 changes: 20 additions & 11 deletions docker-compose.yml
@@ -1,48 +1,58 @@
version: '3'
version: "3"

services:
web:
image: quetz_dev_image
container_name: quetz-web
ports:
- "8000:8000"
build:
build:
context: .
dockerfile: ./docker/Dockerfile
environment:
- HTTPX_LOG_LEVEL=TRACE
command: ["sh", "-c", "./docker/wait-for-postgres.sh database && quetz start /quetz-deployment --host 0.0.0.0 --port 8000 --reload"]
depends_on:
command:
[
"sh",
"-c",
"./docker/wait-for-postgres.sh database && quetz start /quetz-deployment --host 0.0.0.0 --port 8000 --reload",
]
depends_on:
- database
- init-db
volumes:
- .:/code
- quetz_deployment:/quetz-deployment
env_file:
- docker/postgres.env
- docker/postgres.env
init-db:
image: quetz_dev_image
command: ["sh", "-c", "./docker/wait-for-postgres.sh database && quetz create /quetz-deployment --copy-conf /code/docker/docker_config.toml --exists-ok && quetz init-db /quetz-deployment"]
depends_on:
command:
[
"sh",
"-c",
"./docker/wait-for-postgres.sh database && quetz create /quetz-deployment --copy-conf /code/docker/docker_config.toml --exists-ok && quetz init-db /quetz-deployment",
]
depends_on:
- database
volumes:
- .:/code
- quetz_deployment:/quetz-deployment
env_file:
- docker/postgres.env
- docker/postgres.env
database:
image: postgres
volumes:
- ./docker/postgres.conf:/etc/postgresql/postgresql.conf
env_file:
- docker/postgres.env
- docker/postgres.env
prometheus:
image: prom/prometheus
volumes:
- ./docker/prometheus.yml:/etc/prometheus/prometheus.yml
grafana:
image: grafana/grafana
ports:
ports:
- 3000:3000
volumes:
- ./docker/graphana_datasources.yml:/etc/grafana/provisioning/datasources/datasource.yaml
@@ -69,4 +79,3 @@ services:

volumes:
quetz_deployment:
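As a usage sketch for this compose file (service names come from the file above; `docker compose` may be `docker-compose` on older installs):

```bash
# Build the shared dev image, start the full stack in the background,
# then follow the quetz web service logs.
docker compose build web
docker compose up -d
docker compose logs -f web
```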

6 changes: 3 additions & 3 deletions docker/prometheus.yml
@@ -1,14 +1,14 @@
global:
scrape_interval: 15s # By default, scrape targets every 15 seconds.
scrape_interval: 15s # By default, scrape targets every 15 seconds.
# A scrape configuration containing exactly one endpoint to scrape:
# Here it's Prometheus itself.
scrape_configs:
# The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.
- job_name: 'quetz'
- job_name: "quetz"
metrics_path: /metricsp

# Override the global default and scrape targets from this job every 5 seconds.
scrape_interval: 5s

static_configs:
- targets: ['web:8000']
- targets: ["web:8000"]
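To sanity-check the scrape target defined above, the metrics endpoint can be queried directly from the host. A sketch, assuming the compose stack is running and the path matches this config:

```bash
# The quetz metrics endpoint scraped by Prometheus (path from the config above).
curl http://localhost:8000/metricsp
```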
2 changes: 1 addition & 1 deletion environment.yml
@@ -42,4 +42,4 @@ dependencies:
- importlib_metadata
- pre-commit
- pip:
- git+https://github.com/jupyter-server/jupyter_releaser.git@v2
- git+https://github.com/jupyter-server/jupyter_releaser.git@v2
2 changes: 1 addition & 1 deletion plugins/quetz_content_trust/tests/data/key_mgr.json
@@ -19,4 +19,4 @@
"type": "key_mgr",
"version": 1
}
}
}
2 changes: 1 addition & 1 deletion plugins/quetz_content_trust/tests/data/root.json
@@ -26,4 +26,4 @@
"type": "root",
"version": 1
}
}
}
3 changes: 1 addition & 2 deletions plugins/quetz_repodata_patching/README.md
@@ -1,7 +1,6 @@
# quetz_repodata_patching plugin

A plugin for the [quetz](https://github.com/mamba-org/quetz) package server that implements repodata patching. Repodata patching allows hotfixing the package index by changing some metadata in ``repodata.json`` files on the fly. For more information, see the [conda-build](https://docs.conda.io/projects/conda-build/en/latest/concepts/generating-index.html#repodata-patching) docs.

A plugin for the [quetz](https://github.com/mamba-org/quetz) package server that implements repodata patching. Repodata patching allows hotfixing the package index by changing some metadata in `repodata.json` files on the fly. For more information, see the [conda-build](https://docs.conda.io/projects/conda-build/en/latest/concepts/generating-index.html#repodata-patching) docs.

## Installing

3 changes: 1 addition & 2 deletions plugins/quetz_runexports/README.md
@@ -1,6 +1,6 @@
# quetz_runexports

Quetz plugin to extract and expose `run_exports` from package files.
Quetz plugin to extract and expose `run_exports` from package files.

## Installation

@@ -12,7 +12,6 @@ pip install -e .

The plugin should be automatically integrated with the quetz server when you start it.


## Usage

To retrieve the `run_exports`, make a GET request on the following endpoint:
2 changes: 1 addition & 1 deletion plugins/quetz_sql_authenticator/README.md
@@ -24,4 +24,4 @@ The authenticator provides REST routes to create, update, and delete credentials

`PUT /api/sqlauth/credentials/{username}?password={password}`: Update a user's password.

`DELETE /api/sqlauth/credentials/{username}`: Delete a user.
`DELETE /api/sqlauth/credentials/{username}`: Delete a user.
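A sketch of calling these routes with curl; the host, username, and password are placeholders, and the requests must be authorized as required by the deployment:

```bash
# Update a user's password, then delete the user.
curl -X PUT "http://localhost:8000/api/sqlauth/credentials/alice?password=topsecret"
curl -X DELETE "http://localhost:8000/api/sqlauth/credentials/alice"
```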
1 change: 0 additions & 1 deletion plugins/quetz_transmutation/README.md
@@ -2,7 +2,6 @@

This is a plugin to use with the [quetz](https://github.com/mamba-org/quetz) package server.


## Installing

To install use: