This repository provides a Python logging handler, `BlobStorageTimedRotatingFileHandler`, that extends the built-in `TimedRotatingFileHandler`. The handler rotates log files at specified intervals and uploads the rotated files to an Azure Storage Blob container, so your log history is retained in Azure Blob Storage while only the latest log file is kept on the local file system.
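Under the hood, the handler builds on the standard library's `TimedRotatingFileHandler`; the rotation behavior it inherits can be sketched with the stdlib alone (the upload step is what `BlobStorageTimedRotatingFileHandler` adds on top):

```python
import glob
import logging
import os
import tempfile
import time
from logging.handlers import TimedRotatingFileHandler

# Rotate every second, like the LOGGING_WHEN=S / LOGGING_INTERVAL=1 settings below.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "demo.log")

handler = TimedRotatingFileHandler(log_path, when="S", interval=1)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("rotation-demo")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.info("first message")   # written to demo.log
time.sleep(1.2)                # let the 1-second interval elapse
logger.info("second message")  # triggers rollover: demo.log.<timestamp> appears

handler.close()
print(sorted(os.path.basename(p) for p in glob.glob(log_path + "*")))
```

The Blob variant hooks into this rollover and ships the timestamped file (e.g. `demo.log.2024-01-01_00-00-00`) to the configured container instead of leaving it only on disk.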
- Install the required packages:

  ```shell
  pip install azure-storage-blob azure-identity pylogger2azblob
  ```
- Set up a `.env` file with the following content.

  NOTE: You must assign the Contributor / Storage Blob Data Contributor role at the `<your-storage-account-name>` resource scope.

  ```
  #######################################
  # pytest settings
  #######################################
  AZURE_BLOB_TESTLOG_FILE=pytest.log
  AZURE_BLOB_TESTLOG_DIR=./tests/testlog
  AZURE_STORAGE_TESTLOG_ACCOUNT_NAME=<your-storage-account-name>

  #######################################
  # Logging settings
  #######################################
  LOGGING_ACCOUNT_NAME=<your-storage-account-name>
  LOGGING_CONTAINER=<your-container-name>
  LOGGING_LEVEL=DEBUG
  LOGGING_FORMATTER=verbose
  LOGGING_FILENAME=./output.log
  LOGGING_WHEN=S
  LOGGING_INTERVAL=1
  ```
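The role assignment can be done with the Azure CLI; a sketch, assuming you are assigning the role to your own signed-in identity and that `<subscription-id>`, `<resource-group>`, and `<your-storage-account-name>` are placeholders for your environment:

```shell
# Grant the signed-in user the Storage Blob Data Contributor role
# scoped to the storage account (all angle-bracket values are placeholders).
az role assignment create \
  --assignee "$(az ad signed-in-user show --query id -o tsv)" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<your-storage-account-name>"
```

Role assignments can take a minute or two to propagate before blob uploads succeed.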
To demonstrate how to use the logging functionality provided by PYLOGGER2AZBLOB, follow the steps below:
- Ensure you have completed the installation steps in the Installation/Configuration section of this README.
- To load the `.env` file when executing the code below, install python-dotenv:

  ```shell
  pip install python-dotenv
  ```
- Create a Python script, e.g., `tutorial.py`, and copy the following code:

  ```python
  import os
  import logging
  import pylogger2azblob
  from logging.config import dictConfig
  from dotenv import load_dotenv

  # Load environment variables from .env
  load_dotenv()

  LOGGING_ACCOUNT_NAME = os.getenv('LOGGING_ACCOUNT_NAME', '<your-storage-account>')
  LOGGING_CONTAINER = os.getenv('LOGGING_CONTAINER', '<your-container-name>')
  LOGGING_LEVEL = os.getenv('LOGGING_LEVEL', 'DEBUG')
  LOGGING_FORMATTER = os.getenv('LOGGING_FORMATTER', 'verbose')
  LOGGING_FILENAME = os.getenv('LOGGING_FILENAME', '<file-name-you-wanna-output>')
  LOGGING_WHEN = os.getenv('LOGGING_WHEN', 'S')
  LOGGING_INTERVAL = int(os.getenv('LOGGING_INTERVAL', 60))

  LOGGING = {
      'version': 1,
      'formatters': {
          'simple': {
              'format': '%(asctime)s %(message)s',
          },
          'verbose': {
              'format': '%(levelname)s %(hostname)s %(currenttime)s %(message)s',
          }
      },
      'handlers': {
          'blob': {
              'class': 'pylogger2azblob.handlers.BlobStorageTimedRotatingFileHandler',
              'account_name': LOGGING_ACCOUNT_NAME,
              'container': LOGGING_CONTAINER,
              'level': LOGGING_LEVEL,
              'formatter': LOGGING_FORMATTER,
              'filename': LOGGING_FILENAME,
              'when': LOGGING_WHEN,
              'interval': LOGGING_INTERVAL
          }
      },
      'loggers': {
          'example': {
              'handlers': ['blob'],
              'level': LOGGING_LEVEL,
          },
      }
  }

  dictConfig(LOGGING)
  logger = logging.getLogger('example')

  logger.debug('debug message')
  logger.info('info message')
  logger.warning('warning message')
  logger.error('error message')
  logger.critical('critical message')
  ```
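If you prefer configuring the handler in code rather than through `dictConfig`, the handler entry above suggests the constructor accepts the same keys as keyword arguments (that is how `dictConfig` passes them). A minimal sketch under that assumption, with placeholder account and container names:

```python
import logging
from pylogger2azblob.handlers import BlobStorageTimedRotatingFileHandler

# Assumption: the constructor mirrors the dictConfig keys shown above.
handler = BlobStorageTimedRotatingFileHandler(
    filename='./output.log',
    when='S',
    interval=60,
    account_name='<your-storage-account-name>',
    container='<your-container-name>',
)
handler.setLevel(logging.DEBUG)

logger = logging.getLogger('example')
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)
logger.info('info message')
```

Check the handler's signature in `pylogger2azblob/handlers.py` before relying on this form.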
- Execute the tutorial script:

  ```shell
  python tutorial.py
  ```
  This script configures a logger named `example` with `BlobStorageTimedRotatingFileHandler`. It logs messages at different levels (`debug`, `info`, `warning`, `error`, and `critical`) to showcase the functionality.

- Check the specified Azure Blob Storage container to verify that log files are created and rotated according to the specified configuration.
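One way to check from the command line is to list the container's blobs with the Azure CLI, substituting the same account and container placeholders used in your `.env`:

```shell
# List the uploaded log blobs (angle-bracket values are placeholders).
az storage blob list \
  --account-name "<your-storage-account-name>" \
  --container-name "<your-container-name>" \
  --auth-mode login \
  --output table
```

With `LOGGING_WHEN=S` and `LOGGING_INTERVAL=1`, new timestamped blobs should appear within seconds of running the tutorial.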
The result shown above was produced with a dedicated storage account `kazuyalogstorage` and container name `instance-jupyter-log` specified for testing purposes in `tutorial.py`. When you execute `tutorial.py`, confirm that the log files are placed in the container matching the storage account name and container name you specified on the client side.
Now you have successfully set up and used PYLOGGER2AZBLOB to log messages and store them in Azure Blob Storage!