Apache Airflow version
2.3.3
What happened
After upgrading to or installing Airflow 2.3.3, remote_logging in airflow.cfg can't be set to True without triggering a deprecation warning.
I'm using remote logging to an S3 bucket.
It doesn't matter which version of apache-airflow-providers-amazon I have installed.
When using systemd units to start the Airflow components, the webserver spams the deprecation warning every second.
Tested with Python 3.10 and 3.7.3.
What you think should happen instead
When remote logging is enabled, Airflow should not execute a deprecated code path in the background every second.
How to reproduce
Create a Python virtual environment on a machine of your choice, then install apache-airflow and apache-airflow-providers-amazon via pip, as sketched below.
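A minimal setup sketch, assuming the versions reported below; the venv name is just an example, and for a clean install you may also want Airflow's official constraints file for your Python version:

python3 -m venv airflow-venv
source airflow-venv/bin/activate
pip install apache-airflow==2.3.3 apache-airflow-providers-amazon==4.0.0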
Then change the logging section in airflow.cfg:
[logging]
remote_logging = True
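The warning appears as soon as remote_logging is True. For a working S3 setup you would normally also point the logs at a bucket and a connection; the values below are examples, not taken from this issue:

remote_base_log_folder = s3://my-bucket/airflow/logs
remote_log_conn_id = aws_default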
Create a testdag.py containing at least:
from airflow import DAG
Run it with Python to see the warnings:
python testdag.py
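To surface the warning explicitly rather than relying on the interpreter's default warning filters, here is a slightly extended sketch of testdag.py, assuming only that Airflow 2.3.3 is installed and remote_logging = True is set:

# testdag.py -- capture and print DeprecationWarnings raised during the
# Airflow import; the import alone triggers Airflow's logging setup.
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # ensure no warning is filtered out
    from airflow import DAG  # noqa: F401

for w in caught:
    if issubclass(w.category, DeprecationWarning):
        print(f"{w.filename}:{w.lineno} {w.category.__name__}: {w.message}")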
Hint: a few more DeprecationWarnings will appear because the default airflow.cfg generated at install time is not fully up to date.
The deprecation warning you should see when setting remote_logging to True is:
.../lib/python3.10/site-packages/airflow/utils/log/file_task_handler.py:52 DeprecationWarning: Passing filename_template to FileTaskHandler is deprecated and has no effect
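The call appears to originate in Airflow's default logging configuration (airflow/config_templates/airflow_local_settings.py), which seems to still pass filename_template to the remote task handler when remote logging is enabled; I have not confirmed this in the code, so treat it as a guess. As a stopgap that only hides the symptom, the warning can be suppressed with a standard Python warnings filter, e.g.:

python -W "ignore:Passing filename_template to FileTaskHandler is deprecated" testdag.py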
Operating System
Debian GNU/Linux 10 (buster), also tested on Fedora release 36 (Thirty Six)
Versions of Apache Airflow Providers
apache-airflow-providers-amazon 4.0.0
Deployment
Virtualenv installation
Deployment details
A small setup: two virtual machines, with Airflow installed via pip inside a Python virtual environment.
Anything else
The problem occurs on every DAG run, and the warning is logged every second in the journal produced by the webserver systemd unit.
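To watch it live, something like the following works (the unit name airflow-webserver is an example; substitute whatever your deployment calls the webserver service):

journalctl -u airflow-webserver -f | grep DeprecationWarning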
Are you willing to submit PR?
Code of Conduct