BigQuery: add create_job method that takes any kind of job config (and a public configuration property on job classes)
#14
Labels: `api: bigquery` (issues related to the googleapis/python-bigquery API), `type: feature request` ("nice-to-have" improvement, new feature, or different behavior or design).
Is your feature request related to a problem? Please describe.
Some job failures are due only to transient conditions, and the job will succeed if restarted from the beginning. The problem is that this isn't a simple "retry", because the job configuration is mutable.
For example, BigQuery automatically populates a destination table if one is not set. In that case, the destination table should not be part of the retried request. In my opinion, this kind of logic is outside the scope of the client libraries, as it is not clear from the job resource alone when the destination table needs to be cleared.
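The pitfall described above can be sketched without the BigQuery client at all. The names here (`JobConfig`, `submit`) are illustrative stand-ins for the real library API, not part of it:

```python
# Dependency-free sketch of the retry pitfall: the server populates
# `destination` on the first attempt, so replaying the stored
# configuration verbatim pins the retried job to a stale destination.
from dataclasses import dataclass, replace
from typing import Optional


@dataclass(frozen=True)
class JobConfig:
    query: str
    destination: Optional[str] = None  # server fills this in if unset


def submit(config: JobConfig) -> JobConfig:
    # Simulate server-side behavior: populate the destination table
    # if the caller did not set one.
    return replace(config, destination=config.destination or "anon_tbl_1")


first = submit(JobConfig(query="SELECT 1"))
# Naive retry replays the populated config, so the retried job is
# pinned to the server-chosen destination from the first attempt.
naive_retry = submit(first)
# Correct retry clears server-populated fields before resubmitting,
# letting the server choose a fresh destination.
fresh = replace(first, destination=None)
correct_retry = submit(fresh)
```

Deciding *which* fields to clear requires knowing how each one was populated, which is exactly the knowledge the issue argues the client library does not have from the job resource alone.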
Describe the solution you'd like
Add a `create_job` method that takes any `JobConfig` object, and add a public `configuration` property to job classes. Together these would let a developer resubmit a job that failed with a transient error such as `403 rateLimitExceeded` (possibly with resetting the destination table if it was a query job).
Describe alternatives you've considered
Add a `.retry()` method to job classes. This is problematic, primarily because the configuration may have changed since the job was initially created, leading to unintended consequences or hard-to-debug failures.
Additional context
Add any other context or screenshots about the feature request here.
See customer request for a method to retry any job at googleapis/google-cloud-python#5555