The developers at Mystique Unicorn process files as soon as they arrive. At higher scale they are running into latency issues, so they want to switch to an event-driven architecture that processes files with lower latency.
They heard about Azure's capabilities for event processing. Can you help them implement this event processing at Mystique Unicorn?
Our solution enables seamless event processing on Azure Blob Storage using Azure Functions and Event Grid triggers. Azure emits Blob Storage events for operations such as blob creation and deletion. A sample event from Event Grid is shown below:
```json
{
  "id": "538fcf9f-3..-1024-801417067d3a",
  "data": {
    "api": "PutBlob",
    "clientRequestId": "c0c0f290-ec..0bc9ef3b",
    "requestId": "538fcf9f-3..01417000000",
    "eTag": "0x8DB4E3BA4F8E488",
    "contentType": "application/json",
    "contentLength": 40,
    "blobType": "BlockBlob",
    "url": "https://warehouse6p5crf002.blob.core.windows.net/store-events-blob-002/source/7031_2023-05-06_event.json",
    "sequencer": "0000000000000000000000.000005276ba",
    "storageDiagnostics": { "batchId": "2901e730-b..-80d271000000" }
  },
  "topic": null,
  "subject": "/blobServices/default/containers/store-events-blob-002/blobs/source/7031_2023-05-06_event.json",
  "event_type": null
}
```
We can use this event as a trigger, retrieve the blob referenced in `data.url` using the input binding, and persist the processed event back to Blob Storage using the output binding.
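Inside the function, the handler needs to turn the event's `data.url` into a storage account, container and blob path. A minimal sketch of that parsing step (the helper name `parse_blob_event` is illustrative, not taken from the repo):

```python
# Hedged sketch: extracting account, container and blob path from a
# Microsoft.Storage.BlobCreated event payload. Only the stdlib is used.
from urllib.parse import urlparse


def parse_blob_event(event: dict) -> dict:
    """Pull account, container and blob path out of a blob-created event."""
    url = urlparse(event["data"]["url"])
    account = url.netloc.split(".")[0]  # e.g. warehouse6p5crf002
    container, _, blob_path = url.path.lstrip("/").partition("/")
    return {"account": account, "container": container, "blob": blob_path}


sample = {
    "data": {
        "url": (
            "https://warehouse6p5crf002.blob.core.windows.net/"
            "store-events-blob-002/source/7031_2023-05-06_event.json"
        )
    }
}
print(parse_blob_event(sample))
```

In the deployed function the bindings do this resolution for you; a helper like this is mainly useful for logging or for routing the processed output.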
By leveraging the power of Bicep, all necessary resources can be easily provisioned and managed with minimal effort. Our solution uses Python for efficient event processing, allowing for quick and easy deployment of sophisticated event processing pipelines.
This demo, instructions, scripts and Bicep template are designed to be run in `westeurope`. With few or no modifications you can try it out in other regions as well (not covered here).

- 🛠 Azure CLI Installed & Configured - Get help here
- 🛠 Bicep Installed & Configured - Get help here
- 🛠 VS Code & Bicep Extensions - Get help here
Get the application code

```bash
git clone https://github.com/miztiik/azure-blob-eventgrid-trigger-function
cd azure-blob-eventgrid-trigger-function
```
Let's check that the Azure CLI is working:

```bash
# You should have Azure CLI preinstalled
az account show
```

You should see an output like this:

```json
{
  "environmentName": "AzureCloud",
  "homeTenantId": "16b30820b6d3",
  "id": "1ac6fdbff37cd9e3",
  "isDefault": true,
  "managedByTenants": [],
  "name": "YOUR-SUBS-NAME",
  "state": "Enabled",
  "tenantId": "16b30820b6d3",
  "user": {
    "name": "miztiik@",
    "type": "user"
  }
}
```
Stack: Main Bicep

This will create the following resources:

- General purpose Storage Account
  - This will be used by Azure Functions to store the function code
- Storage Account with blob container
  - This will be used to store the events
- Event Grid Topic
  - This will be used to trigger the Azure Function
  - Creates a subscription to the topic that filters for `Microsoft.Storage.BlobCreated` events specific to the blob container
- Python Azure Function
  - Input, trigger and output bindings to the blob container for events
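For reference, in the v1 Python programming model the trigger and bindings above would be declared in a `function.json` along these lines. This is a hedged sketch, not copied from the repo: the binding names, the `WAREHOUSE_STORAGE` connection setting, and the `processed/{rand-guid}.json` output path are illustrative assumptions.

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "eventGridTrigger",
      "direction": "in",
      "name": "event"
    },
    {
      "type": "blob",
      "direction": "in",
      "name": "inputblob",
      "path": "{data.url}",
      "connection": "WAREHOUSE_STORAGE"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputblob",
      "path": "store-events-blob-002/processed/{rand-guid}.json",
      "connection": "WAREHOUSE_STORAGE"
    }
  ]
}
```

`{data.url}` and `{rand-guid}` are binding expressions resolved by the Functions runtime: the first reads the blob URL from the Event Grid payload, the second generates a unique name for each output blob.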
```bash
# make deploy
sh deployment_scripts/deploy.sh
```
After successfully deploying the stack, check the Resource Groups/Deployments section for the resources.
Upload file(s) to blob

Get the storage account and container name from the deployment output. Upload a file to the container and check the function app logs to see the event processing in action.

Here is a sample bash script to upload files to the blob container. You can also upload manually from the portal:
```bash
# Set variables
RESOURCE_GROUP="Miztiik_Enterprises_azure_blob_eventgrid_trigger_function_002"
LOCATION="northeurope"
SA_NAME="warehouse6p5crf002"
CONTAINER_NAME="store-events-blob-002"

for i in {1..2}
do
  FILE_NAME="${RANDOM}_$(date +'%Y-%m-%d')_event.json"
  echo -n "{\"message\": \"hello world on $(date +'%Y-%m-%d')\" , \"timestamp\": \"$(date -u +"%Y-%m-%dT%H:%M:%SZ")\"}" > ${FILE_NAME}
  az storage blob upload \
    --account-name ${SA_NAME} \
    --container-name ${CONTAINER_NAME} \
    --name "source/${FILE_NAME}" \
    --file ${FILE_NAME} \
    --auth-mode login
  sleep 2
  echo -e "\n\n---> ${FILE_NAME} uploaded to ${CONTAINER_NAME} in ${SA_NAME} storage account\n\n"
done
```
You should see an output like this:

```text
---> 27999_2023-05-06_event.json uploaded to store-events-blob-002 in warehouse6p5crf002 storage account
```
Here we have demonstrated how to trigger Azure Functions with an Event Grid trigger and process blob files. You can extend the solution by configuring the function to send the events to other services like Event Hubs or Service Bus, or persist them to Cosmos DB.
If you want to destroy all the resources created by the stack, execute the command below to delete the stack, or delete the stack from the portal. Be sure to remove:

- Resources created during deploying the application
- Any other custom resources you have created for this demo
```bash
# Delete from resource group
az group delete --name Miztiik_Enterprises_xxx --yes
# Follow any on-screen prompt
```
This is not an exhaustive list, please carry out other necessary steps as maybe applicable to your needs.
This repository aims to show how to use Bicep to new developers, Solution Architects & Ops Engineers in Azure.
Thank you for your interest in contributing to our project. Whether it is a bug report, new feature, correction, or additional documentation or solutions, we greatly value feedback and contributions from our community. Start here
Buy me a coffee ☕.
- Azure Event Grid trigger for Azure Functions
- Blob Storage events
- Azure Blob Storage Input Binding
- Azure Blob Storage Output Binding
- Azure Event Grid Filters
- Miztiik Blog - Blob Storage Event Processing with Python Azure Functions
- Miztiik Blog - Blob Storage Processing with Python Azure Functions with HTTP Triggers
Level: 200