Add scheduleOptions to BigQuery Data Transfer. #2507

Closed · wants to merge 2 commits
28 changes: 28 additions & 0 deletions products/bigquerydatatransfer/api.yaml
@@ -84,6 +84,34 @@ objects:
about the format here:
https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: the granularity should be at least 8 hours, or less frequent.
- !ruby/object:Api::Type::NestedObject
name: 'scheduleOptions'
required: false
description: |
Options customizing the data transfer schedule.
properties:
Contributor:
We've recently started trying to prevent people from being able to write empty blocks, and since this nested object has three optional fields, this would allow that. We can mark the fields as exactly_one_of or at_least_one_of. Do either of those apply here?
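For context, the `at_least_one_of` suggestion would look roughly like this on the nested fields in `api.yaml`. This is only a sketch: the exact key name and the dotted-path convention shown here follow the usual Magic Modules pattern, but the precise placement may differ in this repository.

```yaml
# Sketch only: marking each optional field of the nested object as
# at_least_one_of, so an empty schedule_options block is rejected.
- !ruby/object:Api::Type::Boolean
  name: 'disableAutoScheduling'
  at_least_one_of:
    - schedule_options.0.disable_auto_scheduling
    - schedule_options.0.start_time
    - schedule_options.0.end_time
```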

@samouss (Contributor), Oct 9, 2020:
I would be interested in having this available in Terraform. I understand your point about the empty block. I think at_least_one_of could apply here: if you write the block, the intent is to provide at least one of the options. All three have default values, so it shouldn't be problematic, i.e. startTime defaults to now(), endTime defaults to never, and disableAutoScheduling defaults to false.

I'm more than happy to make the change, but I don't think I'm able to push to this PR. Any suggestions on how I can help make this happen? I can close this branch and open a new PR if required. Let me know!

https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.transferConfigs#TransferConfig.ScheduleOptions
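Per that REST reference and the fields added in this diff, a ScheduleOptions payload in a TransferConfig would look roughly like this (timestamp values are illustrative, not from the source):

```json
{
  "scheduleOptions": {
    "disableAutoScheduling": false,
    "startTime": "2020-10-01T00:00:00Z",
    "endTime": "2021-10-01T00:00:00Z"
  }
}
```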

Contributor:
This PR looks pretty abandoned, so I'm going to go ahead and close it out. Looks like someone else picked up the feature at #3895.

- !ruby/object:Api::Type::Boolean
name: 'disableAutoScheduling'
required: false
description: |
If true, automatic scheduling of data transfer runs for this configuration will be disabled.
The runs can be started on an ad hoc basis using the StartManualTransferRuns API.
When automatic scheduling is disabled, the TransferConfig.schedule field will be ignored.
- !ruby/object:Api::Type::String
name: 'startTime'
required: false
description: |
Specifies time to start scheduling transfer runs.
The first run will be scheduled at or after the start time according to a recurrence pattern
defined in the schedule string. The start time can be changed at any moment.
The time when a data transfer can be triggered manually is not limited by this option.
- !ruby/object:Api::Type::String
name: 'endTime'
description: |
Defines time to stop scheduling transfer runs. A transfer run cannot be scheduled at or after the end time.
The end time can be changed at any moment.
The time when a data transfer can be triggered manually is not limited by this option.
required: false
- !ruby/object:Api::Type::Integer
name: 'dataRefreshWindowDays'
description: |