
Update databricks-sdk requirement from ~=0.10.0 to ~=0.11.0 #448

Merged

merged 1 commit into main from dependabot/pip/databricks-sdk-approx-eq-0.11.0 on Oct 12, 2023

Conversation

@dependabot dependabot bot commented on behalf of github Oct 12, 2023

Updates the requirements on databricks-sdk to permit the latest version.

Release notes

Sourced from databricks-sdk's releases.

v0.11.0

  • Added Python 3.12 to project classifiers (#381).
  • Fix serialization issues for generated resources (#382).
  • Fix select spark version in staging (#388).
  • Adjust token expiry window to 40 seconds because of Azure (#392).
  • Add retries on RPC token bucket limit has been exceeded (#395).
  • Regenerate to fix template drift (#398).
  • Update OpenAPI spec to 12 Oct 2023 (#399).
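
The expiry-window change in #392 follows a common pattern: treat a token as stale some seconds before it actually expires, so a proactive refresh happens before any request can race past the deadline. A minimal sketch of that idea (the function, names, and timestamps below are illustrative, not the SDK's actual implementation):

```python
import time
from typing import Optional

# Per the release notes, the SDK widened this window to 40 seconds for Azure.
EXPIRY_WINDOW_SECONDS = 40.0

def needs_refresh(expires_at: float, now: Optional[float] = None,
                  window: float = EXPIRY_WINDOW_SECONDS) -> bool:
    """Return True once `now` is within `window` seconds of expiry."""
    if now is None:
        now = time.time()
    return now >= expires_at - window

# A token expiring 39 seconds from "now" is already considered stale...
assert needs_refresh(expires_at=1039.0, now=1000.0)
# ...while one expiring 41 seconds out is still fresh.
assert not needs_refresh(expires_at=1041.0, now=1000.0)
```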

Internal:

  • GitHub OIDC publishing (#386).
  • Move Release Pipeline to OIDC (#387).

API Changes:

  • Changed download() method for a.billable_usage account-level service to start returning databricks.sdk.service.billing.DownloadResponse dataclass.
  • Added databricks.sdk.service.billing.DownloadResponse dataclass.
  • Changed delete() method for a.account_storage_credentials account-level service with new required argument order.
  • Changed get() method for a.account_storage_credentials account-level service with new required argument order.
  • Changed update() method for a.account_storage_credentials account-level service with new required argument order.
  • Added get_bindings() method for w.workspace_bindings workspace-level service.
  • Added update_bindings() method for w.workspace_bindings workspace-level service.
  • Removed name field for databricks.sdk.service.catalog.AccountsUpdateStorageCredential.
  • Added storage_credential_name field for databricks.sdk.service.catalog.AccountsUpdateStorageCredential.
  • Removed name field for databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest.
  • Added storage_credential_name field for databricks.sdk.service.catalog.DeleteAccountStorageCredentialRequest.
  • Removed name field for databricks.sdk.service.catalog.GetAccountStorageCredentialRequest.
  • Added storage_credential_name field for databricks.sdk.service.catalog.GetAccountStorageCredentialRequest.
  • Added owner field for databricks.sdk.service.catalog.UpdateConnection.
  • Added databricks.sdk.service.catalog.GetBindingsRequest dataclass.
  • Added databricks.sdk.service.catalog.UpdateWorkspaceBindingsParameters dataclass.
  • Added databricks.sdk.service.catalog.WorkspaceBinding dataclass.
  • Added databricks.sdk.service.catalog.WorkspaceBindingBindingType dataclass.
  • Added databricks.sdk.service.catalog.WorkspaceBindingsResponse dataclass.
  • Added spec field for databricks.sdk.service.compute.ClusterDetails.
  • Added apply_policy_default_values field for databricks.sdk.service.compute.ClusterSpec.
  • Removed aws_attributes field for databricks.sdk.service.compute.EditInstancePool.
  • Removed azure_attributes field for databricks.sdk.service.compute.EditInstancePool.
  • Removed disk_spec field for databricks.sdk.service.compute.EditInstancePool.
  • Removed enable_elastic_disk field for databricks.sdk.service.compute.EditInstancePool.
  • Removed gcp_attributes field for databricks.sdk.service.compute.EditInstancePool.
  • Removed preloaded_docker_images field for databricks.sdk.service.compute.EditInstancePool.
  • Removed preloaded_spark_versions field for databricks.sdk.service.compute.EditInstancePool.
  • Added deployment field for databricks.sdk.service.jobs.CreateJob.
  • Added ui_state field for databricks.sdk.service.jobs.CreateJob.
  • Added deployment field for databricks.sdk.service.jobs.JobSettings.
  • Added ui_state field for databricks.sdk.service.jobs.JobSettings.
  • Removed condition_task field for databricks.sdk.service.jobs.RunOutput.

... (truncated)
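
The storage-credential entries above are breaking for 0.10.x callers: the `name` keyword becomes `storage_credential_name`, and the account-level `delete()`/`get()`/`update()` methods change their required-argument order. A stand-in sketch of the kind of call-site update this implies (the function below is a hypothetical placeholder, not the SDK's real API, and the argument values are illustrative):

```python
def get_storage_credential(metastore_id: str, storage_credential_name: str) -> dict:
    # Hypothetical 0.11.0-style signature: `storage_credential_name`
    # replaces the old `name` keyword from 0.10.x.
    return {"metastore_id": metastore_id, "name": storage_credential_name}

# A 0.10.x call site written as get_storage_credential(..., name="...")
# would need to be rewritten with the new keyword:
cred = get_storage_credential(
    metastore_id="abc-123",
    storage_credential_name="my-credential",
)
assert cred["name"] == "my-credential"
```

Passing arguments by keyword rather than position also insulates call sites from the argument-order changes listed above.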

Changelog

Sourced from databricks-sdk's changelog.

0.11.0

(Same entries as the release notes above.)

Commits

  • 18e588f Add contents: write permission to release workflow (#403)
  • 72c09d1 Release v0.11.0 (#401)
  • 46ffcae Update OpenAPI spec to 12 Oct 2023 (#399)
  • 8ed377d Regenerate to fix template drift (#398)
  • afeff49 Add retries on RPC token bucket limit has been exceeded (#395)
  • 26e41fe Adjust token expiry window to 40 seconds because of Azure (#392)
  • 4fb5e38 Fix select spark version in staging (#388)
  • bb3095b Move Release Pipeline to OIDC (#387)
  • 16f7bc0 [WIP] GitHub OIDC publishing (#386)
  • b0ad5d1 Fix serialization issues for generated resources (#382)
  • Additional commits viewable in the compare view: https://github.com/databricks/databricks-sdk-py/compare/v0.10.0...v0.11.0

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.10.0...v0.11.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
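
The `~=0.11.0` specifier in the title is PEP 440's "compatible release" operator: it allows patch releases within 0.11.x but excludes 0.12.0. In a requirements file the two forms below are equivalent (shown here only to illustrate the operator):

```
databricks-sdk~=0.11.0
# equivalent to:
# databricks-sdk>=0.11.0,<0.12.0
```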
@dependabot dependabot bot requested a review from a team October 12, 2023 15:12
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Oct 12, 2023

codecov bot commented Oct 12, 2023

Codecov Report

Merging #448 (4e76add) into main (1d6377e) will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##             main     #448   +/-   ##
=======================================
  Coverage   81.23%   81.23%           
=======================================
  Files          31       31           
  Lines        3027     3027           
  Branches      576      576           
=======================================
  Hits         2459     2459           
  Misses        443      443           
  Partials      125      125           

@nfx nfx merged commit 68db617 into main Oct 12, 2023
5 checks passed
@nfx nfx deleted the dependabot/pip/databricks-sdk-approx-eq-0.11.0 branch October 12, 2023 16:25
@nfx nfx mentioned this pull request Oct 18, 2023
FastLee pushed a commit that referenced this pull request Oct 25, 2023
Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>