Add retry (exponential backoff) for storage api functions #108
Labels: api: storage, type: feature request
While using the python-storage library, we were occasionally getting 503 or 504 errors when deleting or moving objects.
I looked up the Cloud Storage docs and came across guidance recommending retries with exponential backoff for these errors; there is a further note about it at the bottom of that page.
It would be very useful, and in my opinion necessary, to build retry functionality into the python-storage library.
It was also a bit confusing that the official docs recommend a third-party retrying library when google.api.core.retry already exists inside the Google Cloud Python SDK.
The google.api.core.retry docs mention this pattern as well. A potential solution could be to add a retry parameter of type google.api.core.retry to the public API functions, or when creating the storage client. Developers could then opt into exponential backoff and set the parameters as they like (maximum delay, deadline, etc.); a sketch of what this could look like follows below.

The Java and Node.js libraries already have similar functionality, and it seems this would be useful when Cloud Storage is facing issues or high traffic load, or when users are hitting the limits of 1k writes or 5k reads per second and GCS is autoscaling.
If this is already supported in the library, please point me to the relevant docs; I wasn't able to find any.