Commit: Merge branch 'dev' into eia860-2022-final-release
Showing 36 changed files with 479 additions and 163 deletions.
.github/workflows/update-lockfile.yml (new file, +57 lines)

```yaml
---
name: update-lockfile

on:
  workflow_dispatch:
  # schedule:
  #   # At 5:28am UTC Monday and Thursday
  #   - cron: 28 5 * * MON,THU

jobs:
  conda-lock:
    # Don't run scheduled job on forks.
    if: (github.event_name == 'schedule' && github.repository == 'catalyst-cooperative/pudl') || (github.event_name != 'schedule')
    defaults:
      run:
        # Ensure the environment is activated
        # <https://github.com/mamba-org/provision-with-micromamba#important>
        shell: bash -l {0}
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Micromamba
        uses: mamba-org/setup-micromamba@v1
        with:
          environment-file: environments/conda-lock.yml
          environment-name: pudl-dev

      - name: Install pudl from branch
        run: pip install --editable "./[dev,docs,test,datasette]"

      - name: Run conda-lock to recreate lockfile from scratch
        run: |
          rm environments/conda-lock.yml
          conda-lock \
            --file=environments/dev-environment.yml \
            --file=pyproject.toml \
            --lockfile=environments/conda-lock.yml

      - name: Open a pull request
        uses: peter-evans/create-pull-request@v5
        with:
          # # The default GITHUB_TOKEN doesn't allow other workflows to trigger.
          # # Thus if there are tests to be run, they won't be run. For more info,
          # # see the note under
          # # <https://github.com/peter-evans/create-pull-request#action-inputs>.
          # # One possible workaround is to specify a Personal Access Token (PAT).
          # # This PAT should have read-write permissions for "Pull Requests"
          # # and read-write permissions for "Contents".
          # token: ${{ secrets.GH_PAT_FOR_PR }}
          commit-message: Update lockfile
          title: Update Lockfile
          body: >
            This pull request relocks the dependencies with conda-lock.
            It is triggered by [update-lockfile](https://github.com/catalyst-cooperative/pudl/blob/main/.github/workflows/update-lockfile.yml).
          branch: update-lockfile
          labels: dependencies, conda-lock
          reviewers: zaneselvans
          delete-branch: true
```
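Because the `schedule` trigger is commented out, this workflow only runs when dispatched by hand. One way to kick it off is the GitHub CLI (a sketch, assuming `gh` is authenticated against the repository; the `--ref` value is an assumption, not something this diff specifies):

```bash
# Manually dispatch the update-lockfile workflow.
# Pass --ref to pick which branch gets relocked.
gh workflow run update-lockfile --repo catalyst-cooperative/pudl --ref dev
```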
fly/fly.toml (new file, +34 lines)

```toml
# fly.toml app configuration file generated for catalyst-coop-pudl on 2023-11-03T15:31:15-04:00
#
# See https://fly.io/docs/reference/configuration/ for information about how to use this file.
#
app = "catalyst-coop-pudl"
primary_region = "bos"

[[mounts]]
  destination = "/data"
  source = "datasette"

[[services]]
  internal_port = 8080
  protocol = "tcp"

  [services.concurrency]
    hard_limit = 25
    soft_limit = 20

  [[services.ports]]
    handlers = ["http"]
    port = 80

  [[services.ports]]
    handlers = ["tls", "http"]
    port = 443

  [[services.tcp_checks]]
    grace_period = "1m"
    interval = 10000
    timeout = 2000

[deploy]
  wait_timeout = "15m"
```
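The `[[mounts]]` stanza expects a Fly volume named `datasette` to already exist in the app's region; per the note in publish.py below, that volume is managed by hand rather than by this config. Creating it might look like the following (a sketch, assuming `flyctl` is authenticated; the 10 GB size is a hypothetical figure, not taken from this commit):

```bash
# Create the persistent volume that fly.toml mounts at /data.
flyctl volumes create datasette --app catalyst-coop-pudl --region bos --size 10
```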
fly/run.sh (new file, +10 lines)

```bash
#! /usr/bin/env bash
set -eux

shopt -s nullglob

find /data/ -name '*.sqlite' -delete
mv all_dbs.tar.zst /data
zstd -f -d /data/all_dbs.tar.zst -o /data/all_dbs.tar
tar -xf /data/all_dbs.tar --directory /data
datasette serve --host 0.0.0.0 /data/*.sqlite --cors --inspect-file inspect-data.json --metadata metadata.yml --setting sql_time_limit_ms 5000 --port $PORT
```
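The entrypoint decompresses in two explicit steps: `zstd` inflates the archive, then `tar` unpacks it. On a GNU tar (1.31 or later) with the `zstd` binary available, as in the image this commit builds, the extraction could be collapsed into one step (a sketch of an alternative, not what the service actually runs):

```bash
# Single-step alternative: let tar drive the zstd decompression.
tar --zstd -xf /data/all_dbs.tar.zst --directory /data
```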
publish.py (new file, +122 lines)

```python
"""Publish the datasette to fly.io.

We use custom logic here because the datasette-publish-fly plugin bakes the
uncompressed databases into the image, which makes the image too large.

We compress the databases before baking them into the image. Then we decompress
them at runtime to a Fly volume mounted at /data. This avoids a long download
at startup, and allows us to stay within the Fly.io 8GB image size limit.

The volume handling is done manually outside of this publish.py script - it
should be terraformed at some point.

Some static fly.io deployment-related files live in ./fly:

* fly.toml - service configuration
* run.sh - service entrypoint

Apart from that: the Dockerfile and dataset-specific
metadata.yml/inspect-data.json are generated by this script.
"""
import json
import logging
import secrets
from pathlib import Path
from subprocess import check_call, check_output

from pudl.metadata.classes import DatasetteMetadata
from pudl.workspace.setup import PudlPaths

logging.basicConfig(format="%(asctime)s %(message)s", level=logging.INFO)

DOCKERFILE_TEMPLATE = """
FROM python:3.11.0-slim-bullseye
COPY . /app
WORKDIR /app
RUN apt-get update
RUN apt-get install -y zstd
ENV DATASETTE_SECRET '{datasette_secret}'
RUN pip install -U datasette datasette-cluster-map datasette-vega datasette-block-robots
ENV PORT 8080
EXPOSE 8080
CMD ["./run.sh"]
"""


def make_dockerfile():
    """Write a dockerfile from template, to use in fly deploy.

    We write this from template so we can generate a datasette secret. This way
    we don't have to manage secrets at all.
    """
    datasette_secret = secrets.token_hex(16)
    return DOCKERFILE_TEMPLATE.format(datasette_secret=datasette_secret)


def inspect_data(datasets, pudl_out):
    """Pre-inspect databases to generate some metadata for Datasette.

    This is done in the image build process in datasette-publish-fly, but since
    we don't have access to the databases in the build process we have to
    inspect before building the Docker image.
    """
    inspect_output = json.loads(
        check_output(
            [  # noqa: S603
                "datasette",
                "inspect",
            ]
            + [str(pudl_out / ds) for ds in datasets]
        )
    )

    for dataset in inspect_output:
        name = Path(inspect_output[dataset]["file"]).name
        new_filepath = Path("/data") / name
        inspect_output[dataset]["file"] = str(new_filepath)
    return inspect_output


def metadata(pudl_out) -> str:
    """Return human-readable metadata for Datasette."""
    return DatasetteMetadata.from_data_source_ids(pudl_out).to_yaml()


def main():
    """Generate deployment files and run the deploy."""
    fly_dir = Path(__file__).parent.absolute() / "fly"
    docker_path = fly_dir / "Dockerfile"
    inspect_path = fly_dir / "inspect-data.json"
    metadata_path = fly_dir / "metadata.yml"

    pudl_out = PudlPaths().pudl_output
    datasets = [str(p.name) for p in pudl_out.glob("*.sqlite")]
    logging.info(f"Inspecting DBs for datasette: {datasets}...")
    inspect_output = inspect_data(datasets, pudl_out)
    with inspect_path.open("w") as f:
        f.write(json.dumps(inspect_output))

    logging.info("Writing metadata...")
    with metadata_path.open("w") as f:
        f.write(metadata(pudl_out))

    logging.info("Writing Dockerfile...")
    with docker_path.open("w") as f:
        f.write(make_dockerfile())

    logging.info(f"Compressing {datasets} and putting into docker context...")
    check_call(
        ["tar", "-a", "-czvf", fly_dir / "all_dbs.tar.zst"] + datasets,  # noqa: S603
        cwd=pudl_out,
    )

    logging.info("Running fly deploy...")
    check_call(["/usr/bin/env", "flyctl", "deploy"], cwd=fly_dir)  # noqa: S603
    logging.info("Deploy finished!")


if __name__ == "__main__":
    main()
```
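The script gathers every `*.sqlite` file from the PUDL output directory, writes the deploy artifacts into `fly/`, and runs `flyctl deploy` from there. A run might look like this (a sketch: the `PUDL_OUTPUT` variable and the `devtools/datasette/publish.py` path are assumptions, not confirmed by this diff):

```bash
# Point PudlPaths at the directory holding the *.sqlite files,
# then generate the deploy artifacts and push the image to Fly.io.
export PUDL_OUTPUT=~/pudl-work/output
python devtools/datasette/publish.py
```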