Releases: databrickslabs/blueprint

v0.8.2

06 Sep 13:30
@nfx
c531c3f
  • Make hatch a prerequisite (#137). hatch is now a prerequisite for the project's main-branch GitHub workflow, because pip install hatch occasionally fails depending on the local environment. The workflow now defines the hatch version (1.9.4) as an environment variable and adds a dedicated step that installs that pinned version, making the build and testing process more reliable. The pip install hatch line has been removed from the Makefile, so users must install hatch manually before running it; this mirrors the approach taken for ucx. To contribute to this project, install hatch with pip install hatch, clone the GitHub repository, and run make dev to start the development environment and install the necessary dependencies.
  • Support files with unicode BOM (#138). The WorkspacePath class now handles files with a Unicode Byte Order Mark (BOM) during upload and download in Databricks Workspace, and gains a read_text method for easier reading of text from files. When a downloaded file starts with a BOM, the BOM is detected and used for decoding, overriding the encoding that would otherwise be chosen from the system's locale. A new test function verifies that files with different types of BOM are encoded and decoded with the appropriate encoding. Databricks notebooks with a BOM could not be tested, because the Databricks platform modifies uploaded notebook data; even so, this change improves compatibility with a broader range of file encodings and ensures more accurate handling of files that carry a BOM.
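The BOM-aware decoding described above can be sketched in plain Python. This is a minimal illustration of the technique, not the library's actual code; `read_text` here is a hypothetical stand-alone helper:

```python
import codecs

# Known BOMs mapped to their encodings. Longer BOMs are listed first so
# that UTF-32 (which starts with the UTF-16 BOM bytes) is not misdetected.
_BOMS = [
    (codecs.BOM_UTF32_LE, "utf-32-le"),
    (codecs.BOM_UTF32_BE, "utf-32-be"),
    (codecs.BOM_UTF8, "utf-8"),
    (codecs.BOM_UTF16_LE, "utf-16-le"),
    (codecs.BOM_UTF16_BE, "utf-16-be"),
]

def read_text(data: bytes, preferred: str = "utf-8") -> str:
    """Decode bytes; a leading BOM overrides the preferred encoding."""
    for bom, encoding in _BOMS:
        if data.startswith(bom):
            # Strip the BOM before decoding with the encoding it implies.
            return data[len(bom):].decode(encoding)
    return data.decode(preferred)
```

The key design point is that the BOM, when present, wins over the locale-preferred encoding, which matches the behavior described for downloads.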

Contributors: @ericvergnaud

v0.8.1

16 Jul 18:11
@nfx
98e75bc
  • Fixed py3.10 compatibility for _parts in pathlike (#135). This update fixes a Python 3.10 compatibility issue in the _parts property of the path-like implementation. Previously there was also a _cparts property that returned the same value as _parts; it has been removed in favor of referencing _parts directly, and equality comparison, joinpath, and __truediv__ now work in terms of _parts as well. This keeps the library compatible with Python 3.10 and later, ensuring continued functionality and stability for engineers working with current Python versions.
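The underlying technique, keeping path components in a class-owned `_parts` tuple instead of relying on private pathlib internals that shift between Python versions, can be illustrated with a stand-alone sketch (`SimplePath` is illustrative, not the library's class):

```python
class SimplePath:
    """Minimal path-like class that owns its own _parts tuple."""

    def __init__(self, *pathsegments: str) -> None:
        # Split every segment on "/" and drop empties, so that
        # SimplePath("a/b") and SimplePath("a", "b") are equivalent.
        self._parts = tuple(
            part for seg in pathsegments for part in str(seg).split("/") if part
        )

    def joinpath(self, *others: str) -> "SimplePath":
        return SimplePath(*self._parts, *others)

    def __truediv__(self, other: str) -> "SimplePath":
        return self.joinpath(other)

    def __eq__(self, other: object) -> bool:
        # Equality compares the component tuples, not pathlib internals.
        return isinstance(other, SimplePath) and self._parts == other._parts

    def __hash__(self) -> int:
        return hash(self._parts)

    def __str__(self) -> str:
        return "/".join(self._parts)
```

Because `joinpath`, `__truediv__`, and equality are all defined over `_parts`, nothing breaks when CPython reorganizes pathlib's private attributes.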

Contributors: @nfx

v0.8.0

16 Jul 14:54
@nfx
49c74a3
  • Added DBFSPath as os.PathLike implementation (#131). The open-source library has been updated with a new class DBFSPath, an implementation of os.PathLike for Databricks File System (DBFS) paths. This new class extends the existing WorkspacePath support and provides pathlib-like functionality for DBFS paths, including methods for creating directories, renaming and deleting files and directories, and reading and writing files. The addition of DBFSPath includes type-hinting for improved code linting and is integrated in the test suite with new and updated tests for path-like objects. The behavior of the exists and unlink methods has been updated for WorkspacePath to improve performance and raise appropriate errors.
  • Fixed .as_uri() and .absolute() implementations for WorkspacePath (#127). In this release, the WorkspacePath class in the paths.py module has been updated with several improvements to the .as_uri() and .absolute() methods. These methods now utilize PathLib internals, providing better cross-version compatibility. The .as_uri() method now uses an f-string for concatenation and returns the UTF-8 encoded string representation of the WorkspacePath object via a new __bytes__() dunder method. Additionally, the .absolute() method has been implemented for the trivial (no-op) case and now supports returning the absolute path of files or directories in Databricks Workspace. Furthermore, the glob() and rglob() methods have been enhanced to support case-sensitive pattern matching based on a new case_sensitive parameter. To ensure the integrity of these changes, two new test cases, test_as_uri() and test_absolute(), have been added, thoroughly testing the functionality of these methods.
  • Fixed WorkspacePath support for python 3.11 (#121). The WorkspacePath class in our open-source library has been updated to improve compatibility with Python 3.11. The .expanduser() and .glob() methods have been modified to address internal changes in Python 3.11. The is_dir() and is_file() methods now include a follow_symlinks parameter, although it is not currently used. A new method, _scandir(), has been added for compatibility with Python 3.11. The expanduser() method has also been updated to expand ~ (but not ~user) constructs. Additionally, a new method is_notebook() has been introduced to check if the path points to a notebook in Databricks Workspace. These changes aim to ensure that the library functions smoothly with the latest version of Python and provides additional functionality for users working with Databricks Workspace.
  • Properly verify versions of python (#118). In this release, we have made significant updates to the pyproject.toml file to enhance project dependency and development environment management. We have added several new packages to the dependencies section to expand the library's functionality and compatibility. Additionally, we have removed the python field, as it is no longer necessary. We have also updated the path field to specify the location of the virtual environment, which can improve integration with popular development tools such as Visual Studio Code and PyCharm. These changes are intended to streamline the development process and make it easier to manage dependencies and set up the development environment.
  • Type annotations on path-related unit tests (#128). In this open-source library update, type annotations have been added to path-related unit tests to enhance code clarity and maintainability. The tests encompass various scenarios, including verifying if a path exists, creating, removing, and checking directories, and testing file attributes such as distinguishing directories, notebooks, and regular files. The additions also cover functionality for opening and manipulating files in different modes like read binary, write binary, read text, and write text. Furthermore, tests for checking file permissions, handling errors, and globbing (pattern-based file path matching) have been incorporated. The tests interact with a WorkspaceClient mock object, simulating file system interactions. This enhancement bolsters the library's reliability and assists developers in creating robust, well-documented code when working with file system paths.
  • Updated WorkspacePath to support Python 3.12 (#122). In this release, the WorkspacePath implementation has been updated to ensure compatibility with Python 3.12, in addition to Python 3.10 and 3.11. The class was modified to replace most of the internal implementation and add extensive tests for public interfaces, ensuring that the superclass implementations are not used unless they are known to be safe. This change is in response to the significant changes in the superclass implementations between Python 3.11 and 3.12, which were found to be incompatible with each other. The WorkspacePath class now includes several new methods and tests to ensure that it functions seamlessly with different versions of Python. These changes include testing for initialization, equality, hash, comparison, path components, and various path manipulations. This update enhances the library's adaptability and ensures it functions correctly with different versions of Python. Classifiers have also been updated to include support for Python 3.12.
  • WorkspacePath fixes for the .resolve() implementation (#129). The .resolve() method for WorkspacePath has been updated to improve its handling of relative paths and the strict argument. Previously, relative paths were not validated and were returned as-is; now resolving a relative path raises NotImplementedError, since .resolve() is not supported for relative WorkspacePath instances. The strict argument is now honored: if it is set to True and the path does not exist, a FileNotFoundError is raised. The implementation uses .absolute() to obtain the absolute path of the file or directory in Databricks Workspace. A new test, test_resolve(), has been added to verify these changes, covering absolute paths, existing paths, missing paths, and relative paths.
  • WorkspacePath: Fix the .rename() and .replace() implementations to return the target path (#130). The .rename() and .replace() methods of the WorkspacePath class have been updated to return the target path as part of the public API, with .rename() no longer accepting the overwrite keyword argument and always failing if the target path already exists. A new private method, ._rename(), has been added to include the overwrite argument and is used by both .rename() and .replace(). This update is a preparatory step for factoring out common code to support DBFS paths. The tests have been updated accordingly, combining and adding functions to test the new and updated methods. The .unlink() method's behavior remains unchanged. Please note that the exact error raised when .rename() fails due to an existing target path is yet to be defined.
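The rename/replace split described in the last bullet mirrors pathlib semantics: both methods return the target path, and only `.replace()` overwrites. A minimal sketch of that shape, using an in-memory dict as a stand-in for the workspace (the class and its `_fs` store are purely illustrative):

```python
class FakeWorkspacePath:
    """Sketch: rename/replace both delegate to a private _rename."""

    _fs: dict[str, bytes] = {}  # shared in-memory "file system" (illustrative)

    def __init__(self, path: str) -> None:
        self._path = path

    def _rename(self, target: str, overwrite: bool) -> "FakeWorkspacePath":
        if not overwrite and target in self._fs:
            raise FileExistsError(target)
        self._fs[target] = self._fs.pop(self._path)
        return FakeWorkspacePath(target)  # the target path is returned

    def rename(self, target: str) -> "FakeWorkspacePath":
        # Never overwrites; fails if the target already exists.
        return self._rename(target, overwrite=False)

    def replace(self, target: str) -> "FakeWorkspacePath":
        # Overwrites an existing target, like pathlib.Path.replace().
        return self._rename(target, overwrite=True)
```

Returning the new path object from both methods lets callers chain further operations, which is the public-API change the release note describes.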

Dependency updates:

  • Bump sigstore/gh-action-sigstore-python from 2.1.1 to 3.0.0 (#133).

Contributors: @asnare, @nfx, @dependabot[bot]

v0.7.0

05 Jul 10:29
@nfx
d02c9ae
  • Added databricks.labs.blueprint.paths.WorkspacePath as pathlib.Path equivalent (#115). This commit introduces the databricks.labs.blueprint.paths.WorkspacePath library, providing Python-native pathlib.Path-like interfaces to simplify working with Databricks Workspace paths. The library includes WorkspacePath and WorkspacePathDuringTest classes offering advanced functionality for handling user home folders, relative file paths, browser URLs, and file manipulation methods such as read/write_text(), read/write_bytes(), and glob(). This addition brings enhanced, Pythonic ways to interact with Databricks Workspace paths, including creating and moving files, managing directories, and generating browser-accessible URIs. Additionally, the commit includes updates to existing methods and introduces new fixtures for creating notebooks, accompanied by extensive unit tests to ensure reliability and functionality.
  • Added propagation of blueprint version into User-Agent header when it is used as library (#114). A new feature has been introduced in the library that allows for the propagation of the blueprint version and the name of the command line interface (CLI) command used in the User-Agent header when the library is utilized as a library. This feature includes the addition of two new pairs of OtherInfo: blueprint/X.Y.Z to indicate that the request is made using the blueprint library and cmd/<name> to store the name of the CLI command used for making the request. The implementation involves using the with_user_agent_extra function from databricks.sdk.config to set the user agent consistently with the Databricks CLI. Several changes have been made to the test file for test_useragent.py to include a new test case, test_user_agent_is_propagated, which checks if the blueprint version and the name of the command are correctly propagated to the User-Agent header. A context manager http_fixture_server has been added that creates an HTTP server with a custom handler, which extracts the blueprint version and the command name from the User-Agent header and stores them in the user_agent dictionary. The test case calls the foo command with a mocked WorkspaceClient instance and sets the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables to test the propagation of the blueprint version and the command name in the User-Agent header. The test case then asserts that the blueprint version and the name of the command are present and correctly set in the user_agent dictionary.
  • Bump actions/checkout from 4.1.6 to 4.1.7 (#112). The version of the actions/checkout action used in the Checkout Code step of the acceptance workflow has been updated from 4.1.6 to 4.1.7, picking up the latest fixes and improvements to the code checkout process. The Unshallow step remains unchanged, continuing to fetch and clean up the repository's history.
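The User-Agent propagation in #114 boils down to appending `product/version` pairs to the header the SDK sends. The real library does this through with_user_agent_extra from databricks.sdk.config; the sketch below is a simplified, hypothetical stand-in showing the resulting header shape:

```python
# Registered extra product/version pairs (illustrative module state).
_extras: list[tuple[str, str]] = []

def with_user_agent_extra(key: str, value: str) -> None:
    """Register an extra product/version pair for the User-Agent header."""
    _extras.append((key, value))

def build_user_agent(base: str) -> str:
    """Render the header: the base product followed by every extra."""
    return " ".join([base] + [f"{key}/{value}" for key, value in _extras])
```

With `with_user_agent_extra("blueprint", "0.7.0")` and `with_user_agent_extra("cmd", "foo")`, requests would carry a header ending in `blueprint/0.7.0 cmd/foo`, which is what the test_user_agent_is_propagated test asserts against its HTTP fixture server.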

Dependency updates:

  • Bump actions/checkout from 4.1.6 to 4.1.7 (#112).

Contributors: @nfx, @dependabot[bot]

v0.6.3

30 May 20:21
@nfx
5d65c32
  • fixed Command.get_argument_type bug with UnionType (#110). The Command.get_argument_type method now includes special handling for UnionType, resolving a bug that caused a crash when such an annotation was encountered: if the argument's annotation is a UnionType, the method returns its string representation. The implementation uses the types module. The tests were extended accordingly: the foo test function gained an optional optional_arg argument of type str with a default value of None, which is checked in the assertions; a Prompts parameter was added to its signature along with an assertion that prompts is an instance of Prompts; and the default value of the address argument was changed from an empty string to "default" in both foo and the test_injects_prompts test function.

Contributors: @nkvuong

v0.6.2

23 May 15:33
@nfx
be215f1
  • Applied type casting & remove empty kwarg for Command (#108). A new method, get_argument_type, has been added to the Command class in the cli.py file to determine the type of a given argument name based on the function's signature. The _route method has been updated to remove any empty keyword arguments from the kwargs dictionary, and apply type casting based on the argument type using the get_argument_type method. This ensures that the kwargs passed into App.command are correctly typed and eliminates any empty keyword arguments, which were previously passed as empty strings. In the test file for the command-line interface, the foo command's keyword arguments have been updated to include age (int), salary (float), is_customer (bool), and address (str) types, with the name argument remaining and a default value for address. The test_commands and test_injects_prompts functions have been updated accordingly. These changes aim to improve the input validation and type safety of the App.command method.
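The routing behavior can be sketched as follows. This is a simplified, hypothetical stand-in for the `_route` logic, assuming only int, float, bool, and str argument types:

```python
import inspect

def coerce_kwargs(fn, raw: dict[str, str]) -> dict:
    """Drop empty-string kwargs and cast the rest to the annotated types."""
    params = inspect.signature(fn).parameters
    out = {}
    for name, value in raw.items():
        if value == "":  # empty kwargs are removed, not passed as ""
            continue
        annotation = params[name].annotation
        if annotation is int:
            out[name] = int(value)
        elif annotation is float:
            out[name] = float(value)
        elif annotation is bool:
            out[name] = value.lower() in ("true", "1", "yes")
        else:
            out[name] = value
    return out

def foo(name: str, age: int, salary: float, is_customer: bool, address: str = "default"):
    return name, age, salary, is_customer, address
```

Dropping the empty `address` kwarg lets the function's own default kick in, instead of the command receiving an empty string.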

Contributors: @nkvuong

v0.6.1

21 May 10:45
@nfx
cc19164
  • Made ProductInfo.version a cached_property to avoid failure when comparing wheel uploads in development (#105). In this release, the apply method of a class has been updated to sort upgrade scripts in semantic versioning order before applying them, addressing potential issues with version comparison during development. The implementation of ProductInfo.version has been refactored to a cached_property called _version, which calculates and caches the project version, addressing a failure during wheel upload comparisons in development. The Wheels class constructor has also been updated to include explicit keyword-only arguments, and a deprecation warning has been added. These changes aim to improve the reliability and predictability of the upgrade process and the library as a whole.
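Both ideas in this release, caching the computed version and sorting upgrade scripts in semantic rather than lexical order, can be sketched as below (illustrative names and a placeholder version lookup, not the library's API):

```python
from functools import cached_property

def semver_key(version: str) -> tuple[int, ...]:
    """Sort key so that '0.10.0' orders after '0.9.1', unlike string sort."""
    return tuple(int(part) for part in version.lstrip("v").split("."))

class ProductInfo:
    @cached_property
    def version(self) -> str:
        # Computed once and cached on the instance, so repeated
        # wheel-upload comparisons during development see the same value.
        return self._compute_version()

    def _compute_version(self) -> str:
        return "0.6.1"  # placeholder for the real project-version lookup
```

Lexical sorting would put "v0.10.0" before "v0.9.1", which is exactly the kind of mis-ordering the semantic sort avoids when applying upgrade scripts.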

Dependency updates:

  • Bump actions/checkout from 4.1.5 to 4.1.6 (#106).

Contributors: @dependabot[bot], @nkvuong

v0.6.0

12 May 10:35
@nfx
2b75d24
  • Added upstream wheel uploads for Databricks Workspaces without Public Internet access (#99). This commit introduces a new feature for uploading upstream wheel dependencies to Databricks Workspaces without Public Internet access. A new flag has been added to upload functions, allowing users to include or exclude dependencies in the download list. The WheelsV2 class has been updated with a new method, upload_wheel_dependencies(prefixes), which checks if each wheel's name starts with any of the provided prefixes before uploading it to the Workspace File System (WSFS). This feature also includes two new tests to verify the functionality of uploading the main wheel package and dependent wheel packages, optimizing downloads based on specific use cases. This enables users to more easily use the package in offline environments with restricted internet access, particularly for Databricks Workspaces with extra layers of network security.
  • Fixed bug for double-uploading of unreleased wheels in air-gapped setups (#103). In this release, we have addressed a bug in the upload_wheel_dependencies method of the WheelsV2 class, which caused double-uploading of unreleased wheels in air-gapped setups. This issue occurred due to the condition if wheel.name == self._local_wheel.name not being met, resulting in undefined behavior. We have introduced a cached property _current_version to tackle this bug for unreleased versions uploaded to air-gapped workspaces. We also added a new method, upload_to_wsfs(), that uploads files to the workspace file system (WSFS) in the integration test. This release also includes new tests to ensure that only the Databricks SDK is uploaded and that the number of installation files is correct. These changes have resolved the double-uploading issue, and the number of installation files, Databricks SDK, Blueprint, and version.json metadata are now uploaded correctly to WSFS.
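The prefix filtering from #99 and the double-upload fix from #103 come down to one selection rule: upload a dependency wheel only if its name matches a requested prefix and it is not the local wheel itself. A sketch with hypothetical names (the real logic lives in WheelsV2.upload_wheel_dependencies):

```python
def select_wheels_to_upload(
    wheel_names: list[str], local_wheel: str, prefixes: list[str]
) -> list[str]:
    """Pick dependency wheels matching a prefix, skipping the local wheel
    so an unreleased build is not uploaded twice."""
    selected = []
    for name in wheel_names:
        if name == local_wheel:
            continue  # the main wheel is uploaded separately
        if any(name.startswith(prefix) for prefix in prefixes):
            selected.append(name)
    return selected
```

The skipped-local-wheel check is the part that previously misfired for unreleased versions in air-gapped workspaces, producing the double upload.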

Contributors: @aminmovahed-db, @nfx

v0.5.0

08 May 11:08
@nfx
c959367
  • Added content assertion for assert_file_uploaded and assert_file_dbfs_uploaded in MockInstallation (#101). The recent commit introduces a content assertion feature to the MockInstallation class, enhancing its testing capabilities. This is achieved by adding an optional expected parameter of type bytes to the assert_file_uploaded and assert_file_dbfs_uploaded methods, allowing users to verify the uploaded content's correctness. The _assert_upload method has also been updated to accept this new parameter, ensuring the actual uploaded content matches the expected content. Furthermore, the commit includes informative docstrings for the new and updated methods, providing clear explanations of their functionality and usage. To support these improvements, new test cases test_assert_file_uploaded and test_load_empty_data_class have been added to the tests/unit/test_installation.py file, enabling more rigorous testing of the MockInstallation class and ensuring that the expected content is uploaded correctly.
  • Added handling for partial functions in parallel.Threads (#93). In this release, we have enhanced the parallel.Threads module with the ability to handle partial functions, addressing issue #93. This improvement includes the addition of a new static method, _get_result_function_signature, to obtain the signature of a function or a string representation of its arguments and keywords if it is a partial function. The _wrap_result class method has also been updated to log an error message with the function's signature if an exception occurs. Furthermore, we have added a new test case, test_odd_partial_failed, to the unit tests, ensuring that the gather function handles partial functions that raise errors correctly. The Python version required for this project remains at 3.10, and the pyproject.toml file has been updated to include "isort", "mypy", "types-PyYAML", and types-requests in the list of dependencies. These adjustments are aimed at improving the functionality and type checking in the parallel.Threads module.
  • Align configurations with UCX project (#96). This commit brings project configurations in line with the UCX project through various fixes and updates, enhancing compatibility and streamlining collaboration. It addresses pylint configuration warnings, adjusts GitHub Actions workflows, and refines the pyproject.toml file. Additionally, the NiceFormatter class in logger.py has been improved for better code readability, and the versioning scheme has been updated to ensure SemVer and PEP440 compliance, making it easier to manage and understand the project's versioning. Developers adopting the project will benefit from these alignments, as they promote adherence to the project's standards and up-to-date best practices.
  • Check backwards compatibility with UCX, Remorph, and LSQL (#84). This release includes an update to the dependabot configuration to check for daily updates in both the pip and github-actions package ecosystems, with a new directory parameter added for the pip ecosystem for more precise update management. Additionally, a new GitHub Actions workflow, "downstreams", has been added to ensure backwards compatibility with UCX, Remorph, and LSQL by running automated downstream checks on pull requests, merge groups, and pushes to the main branch. The workflow has appropriate permissions for writing id-tokens, reading contents, and writing pull-requests, and runs the downstreams action from the databrickslabs/sandbox repository using GITHUB_TOKEN for authentication. These changes improve the security and maintainability of the project by ensuring compatibility with downstream projects and staying up-to-date with the latest package versions, reducing the risk of potential security vulnerabilities and bugs.

Dependency updates:

  • Bump actions/setup-python from 4 to 5 (#89).
  • Bump softprops/action-gh-release from 1 to 2 (#87).
  • Bump actions/checkout from 2.5.0 to 4.1.2 (#88).
  • Bump codecov/codecov-action from 1 to 4 (#85).
  • Bump actions/checkout from 4.1.2 to 4.1.3 (#95).
  • Bump actions/checkout from 4.1.3 to 4.1.5 (#100).

Contributors: @dependabot[bot], @nfx, @grusin-db, @nkvuong

v0.4.4

27 Mar 23:08
@nfx
5bc05bb
  • If Threads.strict() raises just one error, don't wrap it with ManyError (#79). The strict method in the gather function of the parallel.py module in the databricks/labs/blueprint package has been updated to change the way it handles errors. Previously, if any task in the tasks sequence failed, the strict method would raise a ManyError exception containing all the errors. With this change, if only one error occurs, that error is raised directly without being wrapped in a ManyError exception, simplifying error handling and avoiding unnecessary nesting of exceptions. Additionally, the __tracebackhide__ dunder variable has been set in the method so that its frame is hidden from tracebacks, improving their readability. This update provides a more streamlined experience when handling errors in parallel processing tasks.
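The new raising behavior can be sketched as follows; `ManyError` and `raise_collected` here are illustrative stand-ins for the library's aggregate exception and the strict path:

```python
class ManyError(Exception):
    """Aggregates several task errors into one exception."""

    def __init__(self, errors: list[BaseException]) -> None:
        self.errors = errors
        super().__init__(f"{len(errors)} errors occurred")

def raise_collected(errors: list[BaseException]) -> None:
    """Raise a single error as-is; wrap only when there are several."""
    __tracebackhide__ = True  # hide this frame from pytest tracebacks
    if len(errors) == 1:
        raise errors[0]
    if errors:
        raise ManyError(errors)
```

Callers catching a specific exception type (say, ValueError) now see it directly instead of having to unwrap a one-element ManyError.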

Contributors: @nfx