
Update databricks-sdk requirement from ~=0.12.0 to ~=0.13.0 #575

Merged: 1 commit into main on Nov 14, 2023

Conversation

dependabot[bot] (Contributor) commented on behalf of github on Nov 14, 2023

Updates the requirements on databricks-sdk to permit the latest version.
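The `~=` operator is a PEP 440 "compatible release" specifier, so this bump moves the pin from the 0.12.x series to the 0.13.x series. A minimal sketch of the updated requirement line (which file it lives in is not shown here):

databricks-sdk~=0.13.0  # compatible release: >=0.13.0, <0.14.0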

Release notes

Sourced from databricks-sdk's releases.

v0.13.0

  • Introduce more specific exceptions, like NotFound, AlreadyExists, BadRequest, PermissionDenied, InternalError, and others (#376). This makes it easier to handle errors thrown by the Databricks API. Instead of catching DatabricksError and checking the error_code field, you can catch one of these subtypes of DatabricksError, which is more ergonomic and removes the need to rethrow exceptions that you don't want to catch. For example:
try:
  return (self._ws
    .permissions
    .get(object_type, object_id))
except DatabricksError as e:
  if e.error_code in [
    "RESOURCE_DOES_NOT_EXIST",
    "RESOURCE_NOT_FOUND",
    "PERMISSION_DENIED",
    "FEATURE_DISABLED",
    "BAD_REQUEST"]:
    logger.warning(...)
    return None
  raise RetryableError(...) from e

can be replaced with

try:
  return (self._ws
    .permissions
    .get(object_type, object_id))
except (PermissionDenied, FeatureDisabled):
  logger.warning(...)
  return None
except NotFound:
  raise RetryableError(...)
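A self-contained sketch of the new-style handling, under stated assumptions: the exception classes are imported from databricks.sdk.errors (the notes name the classes but not their import path), and RetryableError is an application-defined error rather than part of the SDK:

import logging

from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import NotFound, PermissionDenied

logger = logging.getLogger(__name__)

class RetryableError(Exception):
    """Hypothetical application-level error, as in the snippet above."""

def get_permissions(ws: WorkspaceClient, object_type: str, object_id: str):
    try:
        return ws.permissions.get(object_type, object_id)
    except PermissionDenied:
        # terminal condition: log and give up without rethrowing
        logger.warning("no access to %s/%s", object_type, object_id)
        return None
    except NotFound as e:
        # transient condition for this app: surface it as retryable
        raise RetryableError(f"{object_type}/{object_id} not found") from e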
  • Paginate all SCIM list requests in the SDK (#440). This change ensures that SCIM list() APIs use a default limit of 100 resources, leveraging SCIM's offset + limit pagination to batch requests to the Databricks API; a usage sketch follows this list.
  • Added taskValues support in remoteDbUtils (#406).
  • Added more detailed error message on default credentials not found error (#419).
  • Request management token via Azure CLI only for Service Principals and not human users (#408).
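To illustrate the pagination change, a minimal sketch: iterating any SCIM list() now fetches pages of 100 transparently. It assumes a WorkspaceClient that can resolve credentials from the environment:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # resolves credentials from env/config
# users.list() is a SCIM API; per #440 it now pages through results
# 100 at a time (offset + limit) instead of one unbounded request
for user in w.users.list():
    print(user.user_name)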

API Changes:

  • Fixed create() method for w.functions workspace-level service and corresponding databricks.sdk.service.catalog.CreateFunction and databricks.sdk.service.catalog.FunctionInfo dataclasses.
  • Changed create() method for w.metastores workspace-level service with new required argument order.
  • Changed storage_root field for databricks.sdk.service.catalog.CreateMetastore to be optional (see the sketch after this list).
  • Added skip_validation field for databricks.sdk.service.catalog.UpdateExternalLocation.
  • Added libraries field for databricks.sdk.service.compute.CreatePolicy, databricks.sdk.service.compute.EditPolicy and databricks.sdk.service.compute.Policy.
  • Added init_scripts field for databricks.sdk.service.compute.EventDetails.
  • Added file field for databricks.sdk.service.compute.InitScriptInfo.
  • Added zone_id field for databricks.sdk.service.compute.InstancePoolGcpAttributes.
  • Added several dataclasses related to init scripts.
  • Added databricks.sdk.service.compute.LocalFileInfo dataclass.
  • Replaced ui_state field with edit_mode for databricks.sdk.service.jobs.CreateJob and databricks.sdk.service.jobs.JobSettings.
  • Replaced databricks.sdk.service.jobs.CreateJobUiState dataclass with databricks.sdk.service.jobs.CreateJobEditMode.
  • Added include_resolved_values field for databricks.sdk.service.jobs.GetRunRequest.
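As a concrete example of the CreateMetastore change above, a hedged sketch: with storage_root optional, a metastore can be created from its name alone. The metastore name is hypothetical, and w is assumed to be a configured WorkspaceClient:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
# storage_root may now be omitted when creating a metastore
metastore = w.metastores.create(name="example-metastore")
print(metastore.metastore_id)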

... (truncated)

Changelog

Sourced from databricks-sdk's changelog. The 0.13.0 entry repeats the release notes above.

Commits
  • 86ca043 Release v0.13.0 (#442)
  • 9ba48cc Paginate all SCIM list requests in the SDK (#440)
  • 50c71a1 Update OpenAPI spec for Python SDK (#439)
  • 69aa629 Regenerate Python SDK after bumping Deco tool (#438)
  • 93a622d Introduce more specific exceptions, like NotFound, AlreadyExists, `BadReq...
  • a862adc Request management token via Azure CLI only for Service Principals and not hu...
  • ee54c3c Make test_auth no longer fail if you have a default profile setup (#426)
  • 5bd7f6b Add more detailed error message on default credentials not found error (#419)
  • 9955c08 [JOBS-11439] Add taskValues support in remoteDbUtils (#406)
  • 2401940 Add regression question to issue template (#414)
  • See full diff in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [databricks-sdk](https://github.com/databricks/databricks-sdk-py) to permit the latest version.
- [Release notes](https://github.com/databricks/databricks-sdk-py/releases)
- [Changelog](https://github.com/databricks/databricks-sdk-py/blob/main/CHANGELOG.md)
- [Commits](databricks/databricks-sdk-py@v0.12.0...v0.13.0)

---
updated-dependencies:
- dependency-name: databricks-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] requested a review from a team on Nov 14, 2023 15:54
dependabot[bot] added the "dependencies" label (Pull requests that update a dependency file) on Nov 14, 2023
nfx merged commit 5e1dcc8 into main on Nov 14, 2023
3 of 4 checks passed
nfx deleted the dependabot/pip/databricks-sdk-approx-eq-0.13.0 branch on Nov 14, 2023 18:16
nfx added a commit that referenced this pull request Nov 17, 2023
**Breaking changes** (existing installations need to reinstall UCX and re-run assessment jobs)

 * Switched local group migration component to rename groups instead of creating backup groups ([#450](#450)).
 * Mitigate permissions loss in Table ACLs by folding grants belonging to the same principal, object id and object type together ([#512](#512)).

**New features**

 * Added support for the experimental Databricks CLI launcher ([#517](#517)).
 * Added support for external Hive Metastores including AWS Glue ([#400](#400)).
 * Added more views to assessment dashboard ([#474](#474)).
 * Added rate limit for creating backup group to increase stability ([#500](#500)).
 * Added deduplication for mount point list ([#569](#569)).
 * Added documentation to describe interaction with external Hive Metastores ([#473](#473)).
 * Added failure injection for job failure message propagation ([#591](#591)).
 * Added uniqueness in the new warehouse name to avoid conflicts on installation ([#542](#542)).
 * Added a global init script to collect Hive Metastore lineage ([#513](#513)).
 * Added retries when setting/updating permissions where possible, and assessment of the changes in the workspace ([#519](#519)).
 * Use `~/.ucx/state.json` to store the state of both dashboards and jobs ([#561](#561)).

**Bug fixes**

 * Fixed handling for `OWN` table permissions ([#571](#571)).
 * Fixed handling of keys with and without values ([#514](#514)).
 * Fixed integration test failures related to concurrent group delete ([#584](#584)).
 * Fixed issue with workspace listing process on None type `object_type` ([#481](#481)).
 * Fixed missing group entitlement migration bug ([#583](#583)).
 * Fixed entitlement application for account-level groups ([#529](#529)).
 * Fixed assessment throwing an error when the owner of an object is empty ([#485](#485)).
 * Fixed installer to migrate between different configuration file versions ([#596](#596)).
 * Fixed cluster policy crawler to be aware of deleted policies ([#486](#486)).
 * Improved the error message for violated not-null constraints ([#532](#532)).
 * Improved integration test resiliency ([#597](#597), [#594](#594), [#586](#586)).
 * Introduced safer access to workspace objects' properties ([#530](#530)).
 * Mitigated permissions loss in Table ACLs by running appliers with single thread ([#518](#518)).
 * Running the permission-apply task before assessment now displays a message ([#487](#487)).
 * Split integration tests from blocking the merge queue ([#496](#496)).
 * Support more than one dashboard per step ([#472](#472)).
 * Update databricks-sdk requirement from ~=0.11.0 to ~=0.12.0 ([#505](#505)).
 * Update databricks-sdk requirement from ~=0.12.0 to ~=0.13.0 ([#575](#575)).
nfx mentioned this pull request Nov 17, 2023
nfx added a commit that referenced this pull request Nov 17, 2023
pritishpai pushed two commits that referenced this pull request on Nov 21, 2023