add and activate point robinson & sunset bay hydrophones (#136)
* add and activate sunset-bay hydrophone

* add and activate point robinson hydrophone
salsal97 committed Nov 4, 2023
1 parent 72fe3d5 commit 712c693
Showing 10 changed files with 163 additions and 19 deletions.
38 changes: 26 additions & 12 deletions InferenceSystem/README.md
@@ -11,16 +11,16 @@ Note: We use Python 3, specifically tested with Python 3.7.4
# How to run the InferenceSystem locally
## Create a virtual environment

1. In your working directory, run `python -m venv inference-venv`. This creates a directory `inference-venv` with relevant files/scripts.
2. On Mac, activate this environment with `source inference-venv/bin/activate` and when you're done, `deactivate`
1. In your working directory, run `pip install virtualenv && virtualenv inference-venv`. This creates a directory `inference-venv` with relevant files/scripts.
2. On Mac or Linux, activate this environment with `source inference-venv/bin/activate` and when you're done, `deactivate`

On Windows, activate with `.\inference-venv\Scripts\activate.bat` and `.\inference-venv\Scripts\deactivate.bat` when done
3. In an active environment, cd to `/InferenceSystem` and run `python -m pip install --upgrade pip && pip install -r requirements.txt`

## Model download

1. Download the current production model from [this link.](https://trainedproductionmodels.blob.core.windows.net/dnnmodel/11-15-20.FastAI.R1-12.zip)
2. Unzip *.zip and extract to `InferenceSystem/model`.
2. Unzip *.zip and extract to `InferenceSystem/model` using `unzip 11-15-20.FastAI.R1-12.zip`
3. Check the contents of `InferenceSystem/model`.
There should be one file:
* model.pkl
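Step 3 above can be sanity-checked with a short snippet (a sketch; `model_folder_ready` is an illustrative helper, not part of the repo):

```python
from pathlib import Path

def model_folder_ready(model_dir: str) -> bool:
    """Return True when the extracted model folder contains exactly model.pkl."""
    contents = sorted(p.name for p in Path(model_dir).iterdir())
    return contents == ["model.pkl"]
```

Running it against `InferenceSystem/model` after extraction should return `True`; anything else means the zip was extracted into a nested folder or left extra files behind.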
@@ -149,8 +149,21 @@ This can be completed in two ways.
```
AZURE_COSMOSDB_PRIMARY_KEY=<key>
AZURE_STORAGE_CONNECTION_STRING=<string>
INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING=<string>
```
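The three secrets above are read from the environment at runtime; a minimal sketch of a fail-fast loader (the `read_secrets` helper is illustrative, not the repo's actual code):

```python
import os

REQUIRED_VARS = [
    "AZURE_COSMOSDB_PRIMARY_KEY",
    "AZURE_STORAGE_CONNECTION_STRING",
    "INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING",
]

def read_secrets() -> dict:
    """Collect the required secrets, raising early if any are unset."""
    missing = [v for v in REQUIRED_VARS if os.getenv(v) is None]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    return {v: os.environ[v] for v in REQUIRED_VARS}
```

Failing at startup with the exact list of missing variables is much easier to debug than an authentication error deep inside the Azure SDK.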

## Adding a new hydrophone

1. Create a new config file under the [config](config) folder

2. Update the last line of the [Dockerfile](Dockerfile) to point to the new config file

3. Create a new deployment YAML under the [deploy](deploy) folder

4. Update [src/LiveInferenceOrchestrator.py](src/LiveInferenceOrchestrator.py) and [src/globals.py](src/globals.py) to add variables for the new hydrophone location

5. Follow all other steps below until you update the Kubernetes cluster with the new namespace
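Step 4 boils down to extending two lookup tables. A minimal sketch of the pattern (the names and the `register_hydrophone` helper are illustrative; the entries duplicate values from this commit's diff):

```python
# Each hydrophone needs a stream-URL entry (src/globals.py) and a
# location-metadata entry (src/LiveInferenceOrchestrator.py).
HYDROPHONE_STREAMS = {
    "sunset_bay": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_sunset_bay",
}

POINT_ROBINSON_LOCATION = {
    "id": "rpi_point_robinson",
    "name": "Point Robinson",
    "longitude": -122.37267,
    "latitude": 47.38838,
}

def register_hydrophone(streams: dict, key: str, url: str) -> dict:
    """Return a copy of the stream table with the new hydrophone added."""
    updated = dict(streams)
    updated[key] = url
    return updated
```

Keeping both tables in sync matters: the orchestrator looks up location metadata by the same `rpi_*` id that names the S3 stream.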

## Building the docker container for production

From the `InferenceSystem` directory, run the following command.
@@ -163,7 +176,8 @@ docker build . -t live-inference-system -f ./Dockerfile

Note: the config used in the Dockerfile is a Production config.

TODO: fix. For now, you will have to manually create 3 different docker containers for the 3 hydrophone locations. Each time you will need to edit the Dockerfile and replace the config for each hydrophone location (OrcasoundLab, BushPoint, PortTownsend).
TODO: fix. For now, you will have to manually create 5 different docker containers for the 5 hydrophone locations. Each time you will need to edit the Dockerfile and replace the config for each hydrophone location (OrcasoundLab, BushPoint, PortTownsend, SunsetBay, and PointRobinson).
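One way the TODO could be resolved is parameterizing the Dockerfile and generating one build command per location. This sketch assumes a hypothetical `CONFIG` build argument (the Dockerfile does not currently accept one, which is exactly why the manual edits are needed today) and config filenames following the `FastAI_LiveHLS_<Location>.yml` pattern from `config/Production`:

```python
# Hypothetical: generate one `docker build` command per hydrophone,
# assuming the Dockerfile were changed to take a CONFIG build argument.
LOCATIONS = ["OrcasoundLab", "BushPoint", "PortTownsend", "SunsetBay", "PointRobinson"]

def build_commands() -> list:
    return [
        f"docker build . -t live-inference-system-{loc.lower()} -f ./Dockerfile "
        f"--build-arg CONFIG=config/Production/FastAI_LiveHLS_{loc}.yml"
        for loc in LOCATIONS
    ]
```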


## Running the docker container

@@ -201,23 +215,21 @@ This step pushes your local container to the Azure Container Registry (ACR). If
documentation is adapted from
[this tutorial](https://docs.microsoft.com/en-us/azure/container-instances/container-instances-tutorial-prepare-acr).

Login to the shared azure directory from the Azure CLI.
1. Log in to the shared Azure directory from the Azure CLI.

```
az login --tenant ai4orcasoutlook.onmicrosoft.com
az login --tenant adminorcasound.onmicrosoft.com
```

We will be using the orcaconservancycr ACR in the LiveSRKWNotificationSystem Resource Group.

Log in to the container registry.
2. We will be using the orcaconservancycr ACR in the LiveSRKWNotificationSystem Resource Group. Log in to the container registry.

```
az acr login --name orcaconservancycr
```

You should receive something similar to `Login succeeded`.

Tag your docker container with the version number. We use the following versioning scheme.
3. Tag your docker container with the version number. We use the following versioning scheme.

```
docker tag live-inference-system orcaconservancycr.azurecr.io/live-inference-system:<date-of-deployment>.<model-type>.<Rounds-trained-on>.<hydrophone-location>.v<Major>
@@ -230,7 +242,8 @@ docker tag live-inference-system orcaconservancycr.azurecr.io/live-inference-system:<date-of-deployment>.<model-type>.<Rounds-trained-on>.<hydrophone-location>.v<Major>
```

Look at [deploy-aci.yaml](deploy-aci.yaml) for examples of how previous models were tagged.
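The versioning scheme above can be sketched as a small tag builder (an illustrative helper, not repo code):

```python
def image_tag(date: str, model_type: str, rounds: str, location: str, major: int) -> str:
    """Compose <date>.<model-type>.<Rounds-trained-on>.<hydrophone-location>.v<Major>."""
    return (
        "orcaconservancycr.azurecr.io/live-inference-system:"
        f"{date}.{model_type}.{rounds}.{location}.v{major}"
    )
```

For example, `image_tag("11-15-20", "FastAI", "R1-12", "SunsetBay", 0)` reproduces the Sunset Bay tag used in this commit's deployment files.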
Lastly, push your image to Azure Container Registry for each Orcasound Hydrophone Location.

4. Lastly, push your image to Azure Container Registry for each Orcasound Hydrophone Location.

```
docker push orcaconservancycr.azurecr.io/live-inference-system:<date-of-deployment>.<model-type>.<Rounds-trained-on>.<hydrophone-location>.v<Major>
@@ -284,7 +297,7 @@ kubectl create secret generic inference-system -n bush-point \
4. Create or update the deployment. Use the file for your hydrophone under the [deploy](./deploy/) folder, or create and commit a new one.

```bash
kubectl apply -f bush-point.yaml
kubectl apply -f deploy/bush-point.yaml
```

5. To verify that the container is running, check logs:
@@ -300,6 +313,7 @@ kubectl logs -n bush-point inference-system-6d4845c5bc-tfsbw
<details>
<summary>Deployment to Azure Container Instances (deprecated)</summary>
# Deploying an updated docker build to Azure Container Instances
# This method has been deprecated

## Prerequisites

10 changes: 10 additions & 0 deletions InferenceSystem/config/Production/FastAI_LiveHLS_PointRobinson.yml
@@ -0,0 +1,10 @@
model_type: "FastAI"
model_local_threshold: 0.5
model_global_threshold: 3
model_path: "./model"
model_name: "model.pkl"
hls_stream_type: "LiveHLS"
hls_polling_interval: 60
hls_hydrophone_id: "rpi_point_robinson"
upload_to_azure: True
delete_local_wavs: True
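The two config files in this commit are flat key/value YAML (the real InferenceSystem parses them with PyYAML's `yaml.load`). A dependency-free sketch of that parsing, enough for this flat format only (`parse_flat_config` is illustrative and notably keeps all values as strings):

```python
def parse_flat_config(text: str) -> dict:
    """Parse flat `key: value` config lines (a sketch of what yaml.load does;
    values are left as strings, and nested YAML is not supported)."""
    config = {}
    for line in text.splitlines():
        if not line.strip() or line.strip().startswith("#"):
            continue
        key, _, value = line.partition(":")
        config[key.strip()] = value.strip().strip('"')
    return config
```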
10 changes: 10 additions & 0 deletions InferenceSystem/config/Production/FastAI_LiveHLS_SunsetBay.yml
@@ -0,0 +1,10 @@
model_type: "FastAI"
model_local_threshold: 0.5
model_global_threshold: 3
model_path: "./model"
model_name: "model.pkl"
hls_stream_type: "LiveHLS"
hls_polling_interval: 60
hls_hydrophone_id: "rpi_sunset_bay"
upload_to_azure: True
delete_local_wavs: True
1 change: 1 addition & 0 deletions InferenceSystem/demos/hls_reader.py
@@ -21,6 +21,7 @@
# 'OrcasoundLab': 'https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_orcasound_lab'
'BushPoint': 'https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_bush_point'
# 'PortTownsend': 'https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_port_townsend'
'SunsetBay': 'https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_sunset_bay'
}

dirname = os.getcwd()
30 changes: 30 additions & 0 deletions InferenceSystem/deploy-aci.yaml
@@ -50,6 +50,36 @@ properties:
requests:
cpu: 1.0
memoryInGB: 3
- name: sunset-bay-live
properties:
environmentVariables:
- name: AZURE_COSMOSDB_PRIMARY_KEY
secureValue: '<cosmos_primary_key>'
- name: AZURE_STORAGE_CONNECTION_STRING
secureValue: '<storage_connection_string>'
- name: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
secureValue: '<appinsights_connection_string>'
image: orcaconservancycr.azurecr.io/live-inference-system:11-15-20.FastAI.R1-12.SunsetBay.v0
ports: []
resources:
requests:
cpu: 1.0
memoryInGB: 3
- name: point-robinson-live
properties:
environmentVariables:
- name: AZURE_COSMOSDB_PRIMARY_KEY
secureValue: '<cosmos_primary_key>'
- name: AZURE_STORAGE_CONNECTION_STRING
secureValue: '<storage_connection_string>'
- name: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
secureValue: '<appinsights_connection_string>'
image: orcaconservancycr.azurecr.io/live-inference-system:11-15-20.FastAI.R1-12.PointRobinson.v0
ports: []
resources:
requests:
cpu: 1.0
memoryInGB: 3
imageRegistryCredentials:
- server: orcaconservancycr.azurecr.io
username: orcaconservancycr
38 changes: 38 additions & 0 deletions InferenceSystem/deploy/point-robinson.yaml
@@ -0,0 +1,38 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: inference-system
namespace: point-robinson
spec:
replicas: 1
selector:
matchLabels:
app: inference-system
template:
metadata:
labels:
app: inference-system
spec:
containers:
- name: inference-system
image: orcaconservancycr.azurecr.io/live-inference-system:11-15-20.FastAI.R1-12.PointRobinson.v0
resources:
limits:
cpu: 1
memory: 3G
env:
- name: AZURE_COSMOSDB_PRIMARY_KEY
valueFrom:
secretKeyRef:
name: inference-system
key: AZURE_COSMOSDB_PRIMARY_KEY
- name: AZURE_STORAGE_CONNECTION_STRING
valueFrom:
secretKeyRef:
name: inference-system
key: AZURE_STORAGE_CONNECTION_STRING
- name: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
valueFrom:
secretKeyRef:
name: inference-system
key: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
38 changes: 38 additions & 0 deletions InferenceSystem/deploy/sunset-bay.yaml
@@ -0,0 +1,38 @@
apiVersion: apps/v1
kind: Deployment
metadata:
name: inference-system
namespace: sunset-bay
spec:
replicas: 1
selector:
matchLabels:
app: inference-system
template:
metadata:
labels:
app: inference-system
spec:
containers:
- name: inference-system
image: orcaconservancycr.azurecr.io/live-inference-system:11-15-20.FastAI.R1-12.SunsetBay.v0
resources:
limits:
cpu: 1
memory: 3G
env:
- name: AZURE_COSMOSDB_PRIMARY_KEY
valueFrom:
secretKeyRef:
name: inference-system
key: AZURE_COSMOSDB_PRIMARY_KEY
- name: AZURE_STORAGE_CONNECTION_STRING
valueFrom:
secretKeyRef:
name: inference-system
key: AZURE_STORAGE_CONNECTION_STRING
- name: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
valueFrom:
secretKeyRef:
name: inference-system
key: INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING
1 change: 0 additions & 1 deletion InferenceSystem/requirements.txt
@@ -24,6 +24,5 @@ numba==0.48
opencv-python
boto3
pytz
json
opencensus-ext-azure
orca-hls-utils
14 changes: 8 additions & 6 deletions InferenceSystem/src/LiveInferenceOrchestrator.py
@@ -21,7 +21,7 @@
from azure.cosmos import exceptions, CosmosClient, PartitionKey

import sys

import logging
from opencensus.ext.azure.log_exporter import AzureLogHandler
from opencensus.ext.azure.log_exporter import AzureEventHandler

@@ -36,8 +36,10 @@
ORCASOUND_LAB_LOCATION = {"id": "rpi_orcasound_lab", "name": "Haro Strait", "longitude": -123.17357, "latitude": 48.55833}
PORT_TOWNSEND_LOCATION = {"id": "rpi_port_townsend", "name": "Port Townsend", "longitude": -122.76045, "latitude": 48.13569}
BUSH_POINT_LOCATION = {"id": "rpi_bush_point", "name": "Bush Point", "longitude": -122.6039, "latitude": 48.03371}
SUNSET_BAY_LOCATION = {"id": "rpi_sunset_bay", "name": "Sunset Bay", "longitude": -122.3339, "latitude": 47.86497}
POINT_ROBINSON_LOCATION = {"id": "rpi_point_robinson", "name": "Point Robinson", "longitude": -122.37267, "latitude": 47.38838}

source_guid_to_location = {"rpi_orcasound_lab" : ORCASOUND_LAB_LOCATION, "rpi_port_townsend" : PORT_TOWNSEND_LOCATION, "rpi_bush_point": BUSH_POINT_LOCATION}
source_guid_to_location = {"rpi_orcasound_lab" : ORCASOUND_LAB_LOCATION, "rpi_port_townsend" : PORT_TOWNSEND_LOCATION, "rpi_bush_point": BUSH_POINT_LOCATION, "rpi_sunset_bay": SUNSET_BAY_LOCATION, "rpi_point_robinson": POINT_ROBINSON_LOCATION }
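The extended mapping can be exercised like this (a standalone sketch; the dict entries duplicate the two new locations from the diff, and `location_for` is an illustrative helper rather than repo code):

```python
SUNSET_BAY_LOCATION = {"id": "rpi_sunset_bay", "name": "Sunset Bay", "longitude": -122.3339, "latitude": 47.86497}
POINT_ROBINSON_LOCATION = {"id": "rpi_point_robinson", "name": "Point Robinson", "longitude": -122.37267, "latitude": 47.38838}

source_guid_to_location = {
    "rpi_sunset_bay": SUNSET_BAY_LOCATION,
    "rpi_point_robinson": POINT_ROBINSON_LOCATION,
}

def location_for(source_guid: str) -> dict:
    """Resolve a hydrophone guid to its location metadata, failing loudly on typos."""
    try:
        return source_guid_to_location[source_guid]
    except KeyError:
        raise ValueError(f"Unknown hydrophone id: {source_guid}")
```

Failing loudly on an unknown guid catches a mismatch between the config's `hls_hydrophone_id` and the mapping at startup rather than mid-inference.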

def assemble_blob_uri(container_name, item_name):

@@ -99,11 +101,11 @@ def populate_metadata_json(
config_params = yaml.load(f, Loader=yaml.FullLoader)

# logger to app insights
con_string = os.getenv('INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING')
app_insights_connection_string = os.getenv('INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING')
logger = logging.getLogger(__name__)
if appInsightsKey is not None:
logger.addHandler(AzureLogHandler(connection_string=con_string))
logger.addHandler(AzureEventHandler(connection_string=con_string))
if app_insights_connection_string is not None:
logger.addHandler(AzureLogHandler(connection_string=app_insights_connection_string))
logger.addHandler(AzureEventHandler(connection_string=app_insights_connection_string))
logger.setLevel(logging.INFO)
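The fix above guards handler registration on the connection string actually being set (the old code checked an unrelated `appInsightsKey`). The same pattern using only the standard library, with `logging.StreamHandler` standing in for `AzureLogHandler`/`AzureEventHandler` from opencensus-ext-azure purely for illustration:

```python
import logging
import os

def build_logger(name: str) -> logging.Logger:
    """Attach a handler only when the App Insights connection string is set."""
    connection_string = os.getenv("INFERENCESYSTEM_APPINSIGHTS_CONNECTION_STRING")
    logger = logging.getLogger(name)
    if connection_string is not None:
        # In the real system this is AzureLogHandler(connection_string=...)
        # and AzureEventHandler(connection_string=...).
        logger.addHandler(logging.StreamHandler())
    logger.setLevel(logging.INFO)
    return logger
```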

## Model Details
2 changes: 2 additions & 0 deletions InferenceSystem/src/globals.py
@@ -4,6 +4,8 @@
"orcasound_lab": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_orcasound_lab",
"port_townsend": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_port_townsend",
"bush_point": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_bush_point",
"sunset_bay": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_sunset_bay",
"point_robinson": "https://s3-us-west-2.amazonaws.com/streaming-orcasound-net/rpi_point_robinson",
}

# Limits time window (end - start) of negative samples to be downloaded for retraining
