Update databricks-sdk requirement from ~=0.10.0 to ~=0.11.0 #448
Merged
Conversation
Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.

- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.10.0...v0.11.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
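The updated pin keeps the compatible-release operator: `~=0.11.0` allows any 0.11.x patch release but excludes 0.12.0 and later. A minimal sketch of what the specifier permits, using the third-party `packaging` library purely for illustration (it is not something this PR adds):

```python
from packaging.specifiers import SpecifierSet

# Compatible-release pin from this PR: equivalent to >=0.11.0, <0.12.0
spec = SpecifierSet("~=0.11.0")

print("0.11.0" in spec)  # True  -- the release this PR targets
print("0.11.3" in spec)  # True  -- future patch releases stay in range
print("0.12.0" in spec)  # False -- the next minor release needs another bump
print("0.10.0" in spec)  # False -- the previously pinned version
```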
dependabot[bot] added the `dependencies` label (Pull requests that update a dependency file) on Oct 12, 2023.
Codecov Report
@@           Coverage Diff           @@
##             main     #448   +/-   ##
=======================================
  Coverage   81.23%   81.23%
=======================================
  Files          31       31
  Lines        3027     3027
  Branches      576      576
=======================================
  Hits         2459     2459
  Misses        443      443
  Partials      125      125
nfx approved these changes on Oct 12, 2023.
FastLee pushed a commit that referenced this pull request on Oct 25, 2023, with the following message:
Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.

Release notes, sourced from [databricks-sdk's releases](https://github.com/databricks/databricks-sdk-py/releases):

v0.11.0

- Added Python 3.12 to project classifiers (#381).
- Fix serialization issues for generated resources (#382).
- Fix select spark version in staging (#388).
- Adjust token expiry window to 40 seconds because of Azure (#392).
- Add retries on `RPC token bucket limit has been exceeded` (#395).
- Regenerate to fix template drift (#398).
- Update OpenAPI spec to 12 Oct 2023 (#399).

Internal:

- GitHub OIDC publishing (#386).
- Move Release Pipeline to OIDC (#387).

API Changes:

- Changed `download()` method for the `a.billable_usage` account-level service to start returning a `databricks.sdk.service.billing.DownloadResponse` dataclass.
- Added `databricks.sdk.service.billing.DownloadResponse` dataclass.
- Changed `delete()`, `get()`, and `update()` methods for the `a.account_storage_credentials` account-level service with a new required argument order.
- Added `get_bindings()` and `update_bindings()` methods for the `w.workspace_bindings` workspace-level service.
- Removed `name` field and added `storage_credential_name` field for `databricks.sdk.service.catalog.AccountsUpdateStorageCredential`.
- Removed `name` field and added `storage_credential_name` field for `databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest`.
- Removed `name` field and added `storage_credential_name` field for `databricks.sdk.service.catalog.GetAccountStorageCredentialRequest`.
- Added `owner` field for `databricks.sdk.service.catalog.UpdateConnection`.
- Added `databricks.sdk.service.catalog.GetBindingsRequest`, `UpdateWorkspaceBindingsParameters`, `WorkspaceBinding`, `WorkspaceBindingBindingType`, and `WorkspaceBindingsResponse` dataclasses.
- Added `spec` field for `databricks.sdk.service.compute.ClusterDetails`.
- Added `apply_policy_default_values` field for `databricks.sdk.service.compute.ClusterSpec`.
- Removed `aws_attributes`, `azure_attributes`, `disk_spec`, `enable_elastic_disk`, `gcp_attributes`, `preloaded_docker_images`, and `preloaded_spark_versions` fields for `databricks.sdk.service.compute.EditInstancePool`.
- Added `deployment` and `ui_state` fields for `databricks.sdk.service.jobs.CreateJob` and `databricks.sdk.service.jobs.JobSettings`.
- Removed `condition_task` field for `databricks.sdk.service.jobs.RunOutput`.

... (truncated)

The [changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md) entry for 0.11.0 lists the same items.
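The storage-credential request dataclasses above replace `name` with `storage_credential_name`, so callers built against `~=0.10.0` need a keyword rename when they construct these requests. A rough sketch of the adjusted construction; the `metastore_id` value and the credential name are hypothetical placeholders, and any other fields are left at their defaults:

```python
from databricks.sdk.service.catalog import GetAccountStorageCredentialRequest

# databricks-sdk ~=0.10.0 carried the credential under a `name` field;
# ~=0.11.0 renames it, per the API changes listed above.
request = GetAccountStorageCredentialRequest(
    metastore_id="<metastore-id>",            # placeholder value
    storage_credential_name="my-credential",  # was `name` in 0.10.x
)
```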
Commits ([compare view](https://github.com/databricks/databricks-sdk-py/compare/v0.10.0...v0.11.0)):

- `18e588f` Add contents: write permission to release workflow (#403)
- `72c09d1` Release v0.11.0 (#401)
- `46ffcae` Update OpenAPI spec to 12 Oct 2023 (#399)
- `8ed377d` Regenerate to fix template drift (#398)
- `afeff49` Add retries on `RPC token bucket limit has been exceeded` (#395)
- `26e41fe` Adjust token expiry window to 40 seconds because of Azure (#392)
- `4fb5e38` Fix select spark version in staging (#388)
- `bb3095b` Move Release Pipeline to OIDC (#387)
- `16f7bc0` [WIP] GitHub OIDC publishing (#386)
- `b0ad5d1` Fix serialization issues for generated resources (#382)
- Additional commits viewable in the compare view.

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>