Infrastructure for storing IceNet predictions and importing them into a database. This is part of the IceNet project.
You will need the following in order to use this package:

- A Microsoft Azure account with at least `Contributor` permissions on the `IceNet` subscription
- Python 3.8 or above
Install Python requirements with the following:

```
pip install --upgrade pip setuptools wheel
pip install -r requirements.txt
```
- Run the Terraform setup script `./setup_terraform.py` like so:

```
./setup_terraform.py -v \
    -i [[admin_subnets]] \
    -s [[subscription_name]] \
    -rg [[state_resourcegroupname]] \
    -sa [[state_accountname]] \
    -sc [[state_containername]] \
    [[docker_login]] \
    [[notification_email]]
```

You can specify the environment with `-e [[ENV]]`, which defaults to `dev`.
- Enter the `terraform` directory with `cd terraform`
- Initialise Terraform by running `terraform init` like so:

```
terraform init -backend-config=backend.[[ENV]].secrets \
    -backend-config='storage_account_name=[[state_accountname]]' \
    -backend-config='container_name=[[state_containername]]'
```
- Check the actions that Terraform will carry out by running `terraform plan -var-file=azure.[[ENV]].secrets`
- Deploy using Terraform by running `terraform apply -var-file=azure.[[ENV]].secrets`
- Switch environments by calling `terraform init` again
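As a sketch, switching from `dev` to `uat` might look like the following, assuming the backend secrets files follow the naming convention above. The `-reconfigure` flag tells Terraform to switch backends without attempting to migrate state:

```shell
# Re-initialise Terraform against the uat backend state.
# backend.uat.secrets is assumed to exist alongside backend.dev.secrets.
terraform init -reconfigure -backend-config=backend.uat.secrets \
    -backend-config='storage_account_name=[[state_accountname]]' \
    -backend-config='container_name=[[state_containername]]'
```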
Note that a full run from fresh will likely fail and the apply will need rerunning, because the resource dependency chaining has not yet been fully sorted out.
This is a work in progress: the processing application needs to be deployed before a final run can deploy a function app that the event grid subscription can target. See this github issue
This is not achievable via the Terraform provider yet, so you will need to provision the email domain for sending manually,
connect it to the comms provider, and then add the notification_email address to the `azure.[[ENV]].secrets` file.
In order to process NetCDF files created by the IceNet pipeline, these need to be uploaded to the blob storage created by the Terraform commands above.

Follow the instructions here to generate tokens for the blob storage at:

- resource group: `rg-icenet[[ENV]]-data`
- storage account: `sticenet[[ENV]]data`
- storage container: `input`

The SAS token will need `Create`, `Write`, `Add` and `List` permissions.
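A minimal sketch of generating such a token and uploading a file with the Azure CLI; the expiry date and the file name `forecast.nc` are placeholders, and this assumes you are logged in with rights on the storage account:

```shell
# Generate a SAS token with Add, Create, Write and List permissions
# for the input container (expiry date is a placeholder).
SAS=$(az storage container generate-sas \
    --account-name "sticenet[[ENV]]data" \
    --name input \
    --permissions acwl \
    --expiry 2099-01-01 \
    --auth-mode login --as-user \
    --output tsv)

# Upload a NetCDF file produced by the pipeline (file name is a placeholder).
az storage blob upload \
    --account-name "sticenet[[ENV]]data" \
    --container-name input \
    --name forecast.nc \
    --file forecast.nc \
    --sas-token "$SAS"
```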
Every time a file is uploaded to the blob storage container it will trigger a run of the processing function. It is possible that the processing might fail, for example if the file is malformed or the process runs out of memory. To retry a failed run, do one of the following:
- delete the blob and then re-upload it
- add metadata to the blob
Other methods are possible (for example interfacing with blob receipts) but these are more complicated.
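The metadata approach can be sketched with the Azure CLI as follows; the metadata key and value are arbitrary, and the blob name is a placeholder. Updating metadata changes the blob's ETag, so the processing function treats it as new:

```shell
# Add (or update) metadata on the blob to trigger reprocessing.
# The retry=1 key/value pair is arbitrary; forecast.nc is a placeholder.
az storage blob metadata update \
    --account-name "sticenet[[ENV]]data" \
    --container-name input \
    --name forecast.nc \
    --metadata retry=1 \
    --auth-mode login
```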
In order to provide access to the NetCDF files stored in blob storage, another SAS token will be needed.

Follow the instructions here to generate tokens for the blob storage at:

- resource group: `rg-icenet[[ENV]]-data`
- storage account: `sticenet[[ENV]]data`
- storage container: `input`

The SAS token will need `Read` and `List` permissions.
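A read-only token can likewise be sketched with the Azure CLI; the expiry date is a placeholder:

```shell
# Generate a read-only SAS token (Read and List) for the input container.
az storage container generate-sas \
    --account-name "sticenet[[ENV]]data" \
    --name input \
    --permissions rl \
    --expiry 2099-01-01 \
    --auth-mode login --as-user \
    --output tsv
```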
`make deploy-azure` deploys each of the applications from the following repositories:

- icenet-geoapi: the secrets file is output to the repository if the variable is set
- icenet-geoapi-processing: deployment to the function app is manual, which is required as mentioned above
- icenet-event-processor: deployed from Docker, or updated in the container if using `dev`
- icenet-application: deployed from GitHub using deploy/sync via a manual source integration
This needs to be done manually because, with a properly configured perimeter, it will be delegated to a secure internal host. Putting it in Terraform would make the implementation much more of a mission!
The following are notes from a recent deployment.

You have to manually provision the email domain on `icenet[ENV]-emails`, then connect the `icenet[ENV]-emails` service to `icenet[ENV]-comms`.

Then you need to deploy the applications:

icenet-geoapi-processing

```
export ICENET_ENV=uat
make deploy-azure
```

icenet-application

```
export ICENET_ENV=uat
make deploy-azure
```

- `make deploy-azure` deploys from https://github.com/icenet-ai/icenet-application, OR
- `make deploy-zip` deploys from the local clone

icenet-event-processor

- In theory, deploys from the Docker registry image
There is no incremental versioning at present.
`v0.0.1` refers to ongoing development until we move into demo usage, at which point this will be reviewed.
This is licensed under the MIT License.