Update databricks-labs-blueprint requirement from >=0.4.3,<0.6.0 to >=0.4.3,<0.7.0 (#1688)

Updates the requirements on
[databricks-labs-blueprint](https://github.com/databrickslabs/blueprint)
to permit the latest version.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/databrickslabs/blueprint/releases">databricks-labs-blueprint's
releases</a>.</em></p>
<blockquote>
<h2>v0.6.0</h2>
<ul>
<li>Added upstream wheel uploads for Databricks Workspaces without
Public Internet access (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/99">#99</a>).
This commit introduces a new feature for uploading upstream wheel
dependencies to Databricks Workspaces without Public Internet access. A
new flag has been added to upload functions, allowing users to include
or exclude dependencies in the download list. The <code>WheelsV2</code>
class has been updated with a new method,
<code>upload_wheel_dependencies(prefixes)</code>, which checks if each
wheel's name starts with any of the provided prefixes before uploading
it to the Workspace File System (WSFS). This feature also includes two
new tests to verify the functionality of uploading the main wheel
package and dependent wheel packages, optimizing downloads based on
specific use cases. This enables users to more easily use the package in
offline environments with restricted internet access, particularly for
Databricks Workspaces with extra layers of network security.</li>
<li>Fixed bug for double-uploading of unreleased wheels in air-gapped
setups (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/103">#103</a>).
In this release, we have addressed a bug in the
<code>upload_wheel_dependencies</code> method of the
<code>WheelsV2</code> class, which caused double-uploading of unreleased
wheels in air-gapped setups. This issue occurred due to the condition
<code>if wheel.name == self._local_wheel.name</code> not being met,
resulting in undefined behavior. We have introduced a cached property
<code>_current_version</code> to tackle this bug for unreleased versions
uploaded to air-gapped workspaces. We also added a new method,
<code>upload_to_wsfs()</code>, that uploads files to the workspace file
system (WSFS) in the integration test. This release also includes new
tests to ensure that only the Databricks SDK is uploaded and that the
number of installation files is correct. These changes have resolved the
double-uploading issue, and the number of installation files, Databricks
SDK, Blueprint, and version.json metadata are now uploaded correctly to
WSFS.</li>
</ul>
<p>Contributors: <a
href="https://github.com/aminmovahed-db"><code>@​aminmovahed-db</code></a>,
<a href="https://github.com/nfx"><code>@​nfx</code></a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/databrickslabs/blueprint/blob/main/CHANGELOG.md">databricks-labs-blueprint's
changelog</a>.</em></p>
<blockquote>
<h2>0.6.0</h2>
<ul>
<li>Added upstream wheel uploads for Databricks Workspaces without
Public Internet access (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/99">#99</a>).
This commit introduces a new feature for uploading upstream wheel
dependencies to Databricks Workspaces without Public Internet access. A
new flag has been added to upload functions, allowing users to include
or exclude dependencies in the download list. The <code>WheelsV2</code>
class has been updated with a new method,
<code>upload_wheel_dependencies(prefixes)</code>, which checks if each
wheel's name starts with any of the provided prefixes before uploading
it to the Workspace File System (WSFS). This feature also includes two
new tests to verify the functionality of uploading the main wheel
package and dependent wheel packages, optimizing downloads based on
specific use cases. This enables users to more easily use the package in
offline environments with restricted internet access, particularly for
Databricks Workspaces with extra layers of network security. (A usage
sketch follows this list.)</li>
<li>Fixed bug for double-uploading of unreleased wheels in air-gapped
setups (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/103">#103</a>).
In this release, we have addressed a bug in the
<code>upload_wheel_dependencies</code> method of the
<code>WheelsV2</code> class, which caused double-uploading of unreleased
wheels in air-gapped setups. This issue occurred due to the condition
<code>if wheel.name == self._local_wheel.name</code> not being met,
resulting in undefined behavior. We have introduced a cached property
<code>_current_version</code> to tackle this bug for unreleased versions
uploaded to air-gapped workspaces. We also added a new method,
<code>upload_to_wsfs()</code>, that uploads files to the workspace file
system (WSFS) in the integration test. This release also includes new
tests to ensure that only the Databricks SDK is uploaded and that the
number of installation files is correct. These changes have resolved the
double-uploading issue, and the number of installation files, Databricks
SDK, Blueprint, and version.json metadata are now uploaded correctly to
WSFS.</li>
</ul>
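To make the new upload flow concrete, here is a minimal usage sketch. The `Installation`/`ProductInfo` wiring and the prefix strings are assumptions based on blueprint's usual installation pattern, not a verbatim excerpt from this release:

```python
from databricks.sdk import WorkspaceClient

from databricks.labs.blueprint.installation import Installation
from databricks.labs.blueprint.wheels import ProductInfo, WheelsV2

w = WorkspaceClient()
product_info = ProductInfo(__file__)  # assumes the calling project's own metadata
installation = Installation.assume_user_home(w, product_info.product_name())

with WheelsV2(installation, product_info) as wheels:
    # Upload dependency wheels whose file names start with any of the
    # given prefixes -- for workspaces without public internet access.
    wheels.upload_wheel_dependencies(["databricks_sdk", "databricks_labs_blueprint"])
    # Upload the main wheel itself to the Workspace File System (WSFS).
    remote_wheel = wheels.upload_to_wsfs()
    print(f"uploaded to {remote_wheel}")
```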
<h2>0.5.0</h2>
<ul>
<li>Added content assertion for <code>assert_file_uploaded</code> and
<code>assert_file_dbfs_uploaded</code> in <code>MockInstallation</code>
(<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/101">#101</a>).
The recent commit introduces a content assertion feature to the
<code>MockInstallation</code> class, enhancing its testing capabilities.
This is achieved by adding an optional <code>expected</code> parameter
of type <code>bytes</code> to the <code>assert_file_uploaded</code> and
<code>assert_file_dbfs_uploaded</code> methods, allowing users to verify
the uploaded content's correctness. The <code>_assert_upload</code>
method has also been updated to accept this new parameter, ensuring the
actual uploaded content matches the expected content. Furthermore, the
commit includes informative docstrings for the new and updated methods,
providing clear explanations of their functionality and usage. To
support these improvements, new test cases
<code>test_assert_file_uploaded</code> and
<code>test_load_empty_data_class</code> have been added to the
<code>tests/unit/test_installation.py</code> file, enabling more
rigorous testing of the <code>MockInstallation</code> class and ensuring
that the expected content is uploaded correctly. (A test sketch follows
this list.)</li>
<li>Added handling for partial functions in
<code>parallel.Threads</code> (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/93">#93</a>).
In this release, we have enhanced the <code>parallel.Threads</code>
module with the ability to handle partial functions, addressing issue <a
href="https://redirect.github.com/databrickslabs/blueprint/issues/93">#93</a>.
This improvement includes the addition of a new static method,
<code>_get_result_function_signature</code>, to obtain the signature of
a function or a string representation of its arguments and keywords if
it is a partial function. The <code>_wrap_result</code> class method has
also been updated to log an error message with the function's signature
if an exception occurs. Furthermore, we have added a new test case,
<code>test_odd_partial_failed</code>, to the unit tests, ensuring that
the <code>gather</code> function handles partial functions that raise
errors correctly. The Python version required for this project remains
at 3.10, and the <code>pyproject.toml</code> file has been updated to
include <code>isort</code>, <code>mypy</code>, <code>types-PyYAML</code>,
and <code>types-requests</code> in the list of dependencies. These
adjustments are aimed at improving the functionality and type checking
in the <code>parallel.Threads</code> module. (Partial-function handling
is sketched after this list.)</li>
<li>Align configurations with UCX project (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/96">#96</a>).
This commit brings project configurations in line with the UCX project
through various fixes and updates, enhancing compatibility and
streamlining collaboration. It addresses pylint configuration warnings,
adjusts GitHub Actions workflows, and refines the
<code>pyproject.toml</code> file. Additionally, the
<code>NiceFormatter</code> class in <code>logger.py</code> has been
improved for better code readability, and the versioning scheme has been
updated to ensure SemVer and PEP440 compliance, making it easier to
manage and understand the project's versioning. Developers adopting the
project will benefit from these alignments, as they promote adherence to
the project's standards and up-to-date best practices.</li>
<li>Check backwards compatibility with UCX, Remorph, and LSQL (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/84">#84</a>).
This release includes an update to the dependabot configuration to check
for daily updates in both the pip and github-actions package ecosystems,
with a new directory parameter added for the pip ecosystem for more
precise update management. Additionally, a new GitHub Actions workflow,
&quot;downstreams&quot;, has been added to ensure backwards
compatibility with UCX, Remorph, and LSQL by running automated
downstream checks on pull requests, merge groups, and pushes to the main
branch. The workflow has appropriate permissions for writing id-tokens,
reading contents, and writing pull-requests, and runs the downstreams
action from the databrickslabs/sandbox repository using GITHUB_TOKEN for
authentication. These changes improve the security and maintainability
of the project by ensuring compatibility with downstream projects and
staying up-to-date with the latest package versions, reducing the risk
of potential security vulnerabilities and bugs.</li>
</ul>
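Two of the items above lend themselves to short sketches. First, the content assertion in `MockInstallation`; the exact call shape is inferred from the description and should be read as an assumption:

```python
from databricks.labs.blueprint.installation import MockInstallation

installation = MockInstallation()
installation.upload("hello.txt", b"hello world")
# The new optional `expected` parameter also verifies the uploaded bytes,
# not just that an upload with a matching name happened.
installation.assert_file_uploaded("hello.txt", expected=b"hello world")
```

Second, partial functions in `parallel.Threads`; `Threads.gather` returning a `(results, errors)` pair follows blueprint's documented usage:

```python
from functools import partial

from databricks.labs.blueprint.parallel import Threads

def power(base: int, exponent: int) -> int:
    return base**exponent

# Each task is a zero-argument callable; partials are now handled, so a
# failed task's log message can include the frozen arguments.
tasks = [partial(power, 2, exponent) for exponent in range(5)]
results, errors = Threads.gather("compute powers", tasks)
assert sorted(results) == [1, 2, 4, 8, 16]
assert not errors
```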
<p>Dependency updates:</p>
<ul>
<li>Bump actions/setup-python from 4 to 5 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/89">#89</a>).</li>
<li>Bump softprops/action-gh-release from 1 to 2 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/87">#87</a>).</li>
<li>Bump actions/checkout from 2.5.0 to 4.1.2 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/88">#88</a>).</li>
<li>Bump codecov/codecov-action from 1 to 4 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/85">#85</a>).</li>
<li>Bump actions/checkout from 4.1.2 to 4.1.3 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/95">#95</a>).</li>
<li>Bump actions/checkout from 4.1.3 to 4.1.5 (<a
href="https://redirect.github.com/databrickslabs/blueprint/pull/100">#100</a>).</li>
</ul>
<h2>0.4.4</h2>
<ul>
<li>If <code>Threads.strict()</code> raises just one error, don't wrap
it with <code>ManyError</code> (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/79">#79</a>).
The <code>strict</code> method in the <code>gather</code> function of
the <code>parallel.py</code> module in the
<code>databricks/labs/blueprint</code> package has been updated to
change the way it handles errors. Previously, if any task in the
<code>tasks</code> sequence failed, the <code>strict</code> method would
raise a <code>ManyError</code> exception containing all the errors. With
this change, if only one error occurs, that error will be raised
directly without being wrapped in a <code>ManyError</code> exception.
This simplifies error handling and avoids unnecessary nesting of
exceptions. Additionally, the <code>__tracebackhide__</code> dunder
variable has been added to the method to improve the readability of
tracebacks by hiding the method's own frame from the user. This update aims to provide a
more streamlined and user-friendly experience for handling errors in
parallel processing tasks. (The new behavior is sketched after this
list.)</li>
</ul>
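The behavioral change is easiest to see with a try/except; a minimal sketch, assuming the `Threads.strict`/`ManyError` API described above:

```python
from databricks.labs.blueprint.parallel import ManyError, Threads

def fail() -> None:
    raise ValueError("boom")

try:
    # A single failing task: the ValueError is now raised directly
    # instead of being wrapped in ManyError.
    Threads.strict("one failure", [fail])
except ValueError as err:
    print(f"surfaced as-is: {err}")

try:
    # Several failing tasks still raise the aggregated ManyError.
    Threads.strict("two failures", [fail, fail])
except ManyError as err:
    print(f"aggregated: {err}")
```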
<h2>0.4.3</h2>
<ul>
<li>Fixed marshalling &amp; unmarshalling edge cases (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/76">#76</a>).
The serialization and deserialization methods in the code have been
updated to improve handling of edge cases during marshalling and
unmarshalling of data. When encountering certain edge cases, the
<code>_marshal_list</code> method will now return an empty list instead
of None, and both the <code>_unmarshal</code> and
<code>_unmarshal_dict</code> methods will return None as is if the input
is None. Additionally, the <code>_unmarshal</code> method has been
updated to call <code>_unmarshal_generic</code> instead of checking if
the type reference is a dictionary or list when it is a generic alias.
The <code>_unmarshal_generic</code> method has also been updated to
handle cases where the input is None. A new test case,
<code>test_load_empty_data_class()</code>, has been added to the
<code>tests/unit/test_installation.py</code> file to verify this
behavior, ensuring that the correct behavior is maintained when
encountering these edge cases during the marshalling and unmarshalling
processes. These changes increase the reliability of the serialization
and deserialization processes. (A round-trip sketch follows this
list.)</li>
</ul>
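A hedged sketch of the round trip these fixes protect; the `MockInstallation` overrides and the `load` signature follow blueprint's installation pattern, but treat the details as assumptions:

```python
from dataclasses import dataclass, field

from databricks.labs.blueprint.installation import MockInstallation

@dataclass
class EvolvedConfig:
    names: list[str] = field(default_factory=list)

# A stored file with no data for the field: _unmarshal now passes None
# through instead of failing, so the default factory can kick in.
installation = MockInstallation({"config.yml": {}})
config = installation.load(EvolvedConfig, filename="config.yml")
print(config.names)  # expected: []
```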
<h2>0.4.2</h2>
<ul>
<li>Fixed edge cases when loading typing.Dict, typing.List and
typing.ClassVar (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/74">#74</a>).
In this release, we have implemented changes to improve the handling of
edge cases related to the Python <code>typing.Dict</code>,
<code>typing.List</code>, and <code>typing.ClassVar</code> during
serialization and deserialization of dataclasses and generic types.
Specifically, we have modified the <code>_marshal</code> and
<code>_unmarshal</code> functions to check for the
<code>__origin__</code> attribute to determine whether the type is a
<code>ClassVar</code> and skip it if it is. The
<code>_marshal_dataclass</code> and <code>_unmarshal_dataclass</code>
functions now check for the <code>__dataclass_fields__</code> attribute
to ensure that only dataclass fields are marshaled and unmarshaled. We
have also added a new unit test for loading a complex data class using
the <code>MockInstallation</code> class, which contains various
attributes such as a string, a nested dictionary, a list of
<code>Policy</code> objects, and a dictionary mapping string keys to
<code>Policy</code> objects. This test case checks that the installation
object correctly serializes and deserializes the
<code>ComplexClass</code> instance to and from JSON format according to
the specified attribute types, including handling of the
<code>typing.Dict</code>, <code>typing.List</code>, and
<code>typing.ClassVar</code> types. These changes improve the
reliability and robustness of our library in handling complex data types
defined in the <code>typing</code> module.</li>
<li><code>MockPrompts.extend()</code> now returns a copy (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/72">#72</a>).
In the latest release, the <code>extend()</code> method in the
<code>MockPrompts</code> class of the <code>tui.py</code> module has
been enhanced. Previously, <code>extend()</code> would modify the
original <code>MockPrompts</code> object, which could lead to issues
when reusing the same object in multiple places within the same test, as
its state would be altered each time <code>extend()</code> was called.
This has been addressed by updating the <code>extend()</code> method to
return a copy of the <code>MockPrompts</code> object with the updated
patterns and answers, instead of modifying the original object. This
change ensures that the original <code>MockPrompts</code> object can be
securely reused in multiple test scenarios without unintended side
effects, preserving the integrity of the original state. Furthermore,
additional tests have been incorporated to verify the correct behavior
of both the new and original prompts. (An example follows this
list.)</li>
</ul>
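To see why returning a copy matters, consider reusing one base object across several tests; the pattern-to-answer mapping follows the documented `MockPrompts` shape:

```python
from databricks.labs.blueprint.tui import MockPrompts

base = MockPrompts({r"Proceed.*": "yes"})

# Since this release, extend() returns a new MockPrompts object;
# `base` keeps only its original pattern and can be reused safely.
with_number = base.extend({r"How many.*": "42"})
```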
<h2>0.4.1</h2>
<ul>
<li>Fixed <code>MockInstallation</code> to emulate workspace-global
setup (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/69">#69</a>).
In this release, the <code>MockInstallation</code> class in the
<code>installation</code> module has been updated to better replicate a
workspace-global setup, enhancing testing and development accuracy. The
<code>is_global</code> method now utilizes the <code>product</code>
method instead of <code>_product</code>, and a new instance variable
<code>_is_global</code> with a default value of <code>True</code> is
introduced in the <code>__init__</code> method. Moreover, a new
<code>product</code> method is included, which consistently returns the
string &quot;mock&quot;. These enhancements resolve issue <a
href="https://redirect.github.com/databrickslabs/blueprint/issues/69">#69</a>,
&quot;Fixed <code>MockInstallation</code> to emulate workspace-global
setup&quot;, ensuring the <code>MockInstallation</code> instance behaves
as a global installation, facilitating precise and reliable testing and
development for our software engineering team.</li>
<li>Improved <code>MockPrompts</code> with <code>extend()</code> method
(<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/68">#68</a>).
In this release, we've added an <code>extend()</code> method to the
<code>MockPrompts</code> class in our library's TUI module. This new
method allows developers to add new patterns and corresponding answers
to the existing list of questions and answers in a
<code>MockPrompts</code> object. The added patterns are compiled as
regular expressions and the questions and answers list is sorted by the
length of the regular expression patterns in descending order. This
feature is particularly useful for writing tests where prompt answers
need to be changed, as it enables better control and customization of
prompt responses during testing. By extending the list of questions and
answers, you can handle additional prompts without modifying the
existing ones, resulting in more organized and maintainable test code.
If a prompt hasn't been mocked, attempting to ask a question with it
will raise a <code>ValueError</code> with an appropriate error
message.</li>
<li>Use Hatch v1.9.4 as the build machine requirement (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/70">#70</a>).
The Hatch package version for the build machine requirement has been
updated from 1.7.0 to 1.9.4 in this change. This update streamlines the
Hatch setup and version management, removing the specific installation
step and listing <code>hatch</code> directly in the required field. The
pre-setup command now only includes &quot;hatch env create&quot;.
Additionally, the acceptance tool version has been updated to ensure
consistent project building and testing with the specified Hatch
version. This change is implemented in the acceptance workflow file and
the version of the acceptance tool used by the sandbox. This update
ensures that the project can utilize the latest features and bug fixes
available in Hatch 1.9.4, improving the reliability and efficiency of
the build process. This change is part of the resolution of issue <a
href="https://redirect.github.com/databrickslabs/blueprint/issues/70">#70</a>.</li>
</ul>
<h2>0.4.0</h2>
<ul>
<li>Added commands with interactive prompts (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/66">#66</a>).
This commit introduces a new feature in the Databricks Labs project to
support interactive prompts in the command-line interface (CLI) for
enhanced user interactivity. The <code>Prompts</code> argument, imported
from <code>databricks.labs.blueprint.tui</code>, is now integrated into
the <code>@app.command</code> decorator, enabling the creation of
commands with user interaction like confirmation prompts. An example of
this is the <code>me</code> command, which confirms whether the user
wants to proceed before displaying the current username. The commit also
refactored the code to make it more efficient and maintainable, removing
redundancy in creating client instances. The <code>AccountClient</code>
and <code>WorkspaceClient</code> instances can now be provided
automatically with the product name and version. These changes improve
the CLI by making it more interactive, user-friendly, and adaptable to
various use cases while also optimizing the codebase for better
efficiency and maintainability. (A wiring sketch follows this
list.)</li>
<li>Added more code documentation (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/64">#64</a>).
This release introduces new features and updates to various files in the
open-source library. The <code>cli.py</code> file in the
<code>src/databricks/labs/blueprint</code> directory has been updated
with a new decorator, <code>command</code>, which registers a function
as a command. The <code>entrypoint.py</code> file in the
<code>databricks.labs.blueprint</code> module now includes a
module-level docstring describing its purpose, as well as documentation
for the various standard libraries it imports. The
<code>Installation</code> class in the <code>installers.py</code> file
has new methods for handling files, such as <code>load</code>,
<code>load_or_default</code>, <code>upload</code>,
<code>load_local</code>, and <code>files</code>. The
<code>installers.py</code> file also includes a new
<code>InstallationState</code> dataclass, which is used to track
installations. The <code>limiter.py</code> file now includes code
documentation for the <code>RateLimiter</code> class and the
<code>rate_limited</code> decorator, which are used to limit the rate of
requests. The <code>logger.py</code> file includes a new
<code>NiceFormatter</code> class, which provides a nicer format for
logging messages with colors and bold text if the console supports it.
The <code>parallel.py</code> file has been updated with new methods for
running tasks in parallel and returning results and errors. The
<code>tui.py</code> file has been documented, and includes imports for
logging, regular expressions, and collections abstract base class.
Lastly, the <code>upgrades.py</code> file has been updated with
additional code documentation and new methods for loading and applying
upgrade scripts. Overall, these changes improve the functionality,
maintainability, and usability of the open-source library.</li>
<li>Fixed init-project command (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/65">#65</a>).
In this release, the <code>init-project</code> command has been improved
with several bug fixes and new functionalities. A new import statement
for the <code>sys</code> module has been added, and a <code>docs</code>
directory is now included in the copied directories and files during
initialization. The <code>init_project</code> function has been updated
to open files using the default system encoding, ensuring proper reading
and writing of file contents. The <code>relative_paths</code> function
in the <code>entrypoint.py</code> file now returns absolute paths if the
common path is the root directory, addressing issue <a
href="https://redirect.github.com/databrickslabs/blueprint/issues/41">#41</a>.
Additionally, several test functions have been added to
<code>tests/unit/test_entrypoint.py</code>, enhancing the reliability
and robustness of the <code>init-project</code> command by providing
comprehensive tests for supporting functions. Overall, these changes
significantly improve the functionality and reliability of the
<code>init-project</code> command, ensuring a more consistent and
accurate project initialization process.</li>
</ul>
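The `me` command described above might be wired up roughly as follows; the `App` registration details are an assumption based on how blueprint-based CLIs are typically structured, not a verbatim excerpt:

```python
from databricks.sdk import WorkspaceClient

from databricks.labs.blueprint.cli import App
from databricks.labs.blueprint.tui import Prompts

app = App(__file__)

@app.command
def me(w: WorkspaceClient, prompts: Prompts):
    """Show the current username, after an interactive confirmation."""
    if prompts.confirm("Display the current username?"):
        print(w.current_user.me().user_name)

if __name__ == "__main__":
    app()
```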
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/2b75d24a0c428d00cbe76154043223a14e202475"><code>2b75d24</code></a>
Release v0.6.0 (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/104">#104</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/41e4aab2fea7735a9be742d16edfcc63b04fce11"><code>41e4aab</code></a>
Fixed bug for double-uploading of unreleased wheels in air-gapped setups
(<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/103">#103</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/50b54749631901beb2fbf6c116697d2dd0562be4"><code>50b5474</code></a>
Added upstream wheel uploads for Databricks Workspaces without Public
Interne...</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/c9593676c2aab450ab8225829520f233aa539812"><code>c959367</code></a>
Release v0.5.0 (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/102">#102</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/47ab384543af1ce512fb12bc664d810f410fbeea"><code>47ab384</code></a>
Bump actions/checkout from 4.1.3 to 4.1.5 (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/100">#100</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/aa3bf8cc1e417883b25799f598631d59230b9931"><code>aa3bf8c</code></a>
Added content assertion for <code>assert_file_uploaded</code> and
<code>assert_file_dbfs_uplo...</code></li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/a5a85631b46207ee1cec4b3e3cddae14e2b1aedf"><code>a5a8563</code></a>
Align configurations with UCX project (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/96">#96</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/43add0bd89c1bf80f80f05dae8fcd0173e9aaad3"><code>43add0b</code></a>
Bump actions/checkout from 4.1.2 to 4.1.3 (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/95">#95</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/d2ceef7fdbf8b7998f58539d17f9a02022ce5744"><code>d2ceef7</code></a>
Handle partial functions in <code>parallel.Threads</code> (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/93">#93</a>)</li>
<li><a
href="https://github.com/databrickslabs/blueprint/commit/ea62287aee3eb5faa34d0a2f33a6b470cc803db1"><code>ea62287</code></a>
Bump codecov/codecov-action from 1 to 4 (<a
href="https://redirect.github.com/databrickslabs/blueprint/issues/85">#85</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/databrickslabs/blueprint/compare/v0.4.3...v0.6.0">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
dependabot[bot] authored May 21, 2024
1 parent 7902156 commit 00434ee
Showing 1 changed file with 1 addition and 1 deletion.
pyproject.toml (1 addition, 1 deletion):

@@ -46,7 +46,7 @@ classifiers = [
 
 dependencies = ["databricks-sdk~=0.27.0",
 "databricks-labs-lsql~=0.4.0",
-"databricks-labs-blueprint>=0.4.3,<0.6.0",
+"databricks-labs-blueprint>=0.4.3,<0.7.0",
 "PyYAML>=6.0.0,<7.0.0",
 "sqlglot>=23.9,<23.16"]
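The whole point of the widened upper bound is to admit the 0.6.0 release described above; a quick check with the third-party `packaging` library (an illustration, not part of this commit):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

old, new = SpecifierSet(">=0.4.3,<0.6.0"), SpecifierSet(">=0.4.3,<0.7.0")
print(Version("0.6.0") in old)  # False -- why Dependabot proposed this bump
print(Version("0.6.0") in new)  # True
```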
