Mark task as failed when it fails "sending" in Celery #10881

Merged: 1 commit, Sep 14, 2020

Commits on Sep 14, 2020

  1. Mark task as failed when it fails sending in Celery

    If a task failed hard on Celery, _before_ being able to execute the
    Airflow code, the task would end up stuck in the queued state. This
    change makes it get retried.

    This was discovered while load testing the HA work (though it is
    unrelated to the HA changes): I swamped the kube-dns pod, so the worker
    was sometimes unable to resolve the DB hostname via DNS, and the state
    in the DB was never updated.
    ashb committed Sep 14, 2020
    Commit SHA: 77cb73d
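
The fix amounts to catching failures at the point where the executor hands a task off to Celery and recording a failed state, instead of leaving the task queued forever. Below is a minimal sketch of that pattern, not the actual Airflow CeleryExecutor code; the names `send_task_to_broker`, `change_state`, and `TaskState` are illustrative placeholders.

```python
from enum import Enum


class TaskState(Enum):
    # Illustrative states only; Airflow has its own State constants.
    QUEUED = "queued"
    FAILED = "failed"


def trigger_task(task_key, command, send_task_to_broker, change_state):
    """Try to send one task to Celery; mark it FAILED if sending raises.

    send_task_to_broker: callable that enqueues the command on the broker
        (in a real executor this would be something like
        celery_task.apply_async(...)).
    change_state: callable that records the task's state in the metadata DB.
    """
    try:
        result = send_task_to_broker(command)
    except Exception:
        # Previously the task would stay QUEUED here with nothing to move it
        # on; marking it FAILED lets the normal retry machinery pick it up.
        change_state(task_key, TaskState.FAILED)
        return None
    return result
```

The key design point, as described in the commit message, is that the failure is recorded as a terminal state the scheduler already knows how to handle (failed, hence eligible for retry), rather than inventing a new recovery path for tasks stuck in queued.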