
Releases: datacontract/datacontract-cli

v0.10.15

02 Dec 13:00

[0.10.15] - 2024-12-02

Added

  • Support for model import from Parquet file metadata.
  • Great Expectations export: add optional args (#496)
    • suite_name: the name of the expectation suite to export
    • engine: the engine used to run checks
    • sql_server_type: the type of SQL Server to use when engine is sql
  • Changelog support for Info and Terms blocks.
  • datacontract import now has an --output option for saving the data contract to a file (see the sketch after this list)
  • Enhance JSON file validation (local and S3) to return the first error for each JSON object; the maximum number of total errors can be configured via the environment variable DATACONTRACT_MAX_ERRORS. The primaryKey is also added to the error message.
  • Fix an issue where records with no fields create an invalid BigQuery schema.
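
A minimal usage sketch for two of the additions above, assuming the documented datacontract import and test invocations; file names and the error limit are placeholders:

```python
import os
import subprocess

# Import a SQL DDL file and write the generated data contract to a file
# via the new --output option (source and output paths are placeholders).
subprocess.run(
    [
        "datacontract", "import",
        "--format", "sql",
        "--source", "ddl.sql",
        "--output", "datacontract.yaml",
    ],
    check=True,
)

# Cap the number of errors reported during JSON validation in `datacontract test`
# by setting DATACONTRACT_MAX_ERRORS (the value 10 is an arbitrary example).
env = dict(os.environ, DATACONTRACT_MAX_ERRORS="10")
subprocess.run(["datacontract", "test", "datacontract.yaml"], check=True, env=env)
```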

Changed

  • Changelog support for custom extension keys in Models and Fields blocks.
  • datacontract catalog --files '*.yaml' now also checks subfolders for matching files.
  • Optimized the test output table on the console when tests fail

Fixed

  • Raise a proper exception in DataContractSpecification.from_file if the file does not exist
  • Fix importing JSON Schemas containing deeply nested objects without a required array
  • SodaCL: Only add data quality tests for executable queries

v0.10.14

26 Oct 19:56

[0.10.14] - 2024-10-26

Data Contract CLI now supports the Open Data Contract Standard (ODCS) v3.0.0.

Added

  • datacontract test now also supports ODCS v3 data contract format
  • datacontract export --format odcs_v3: Export to Open Data Contract Standard v3.0.0 (#460) (see the sketch after this list)
  • datacontract test now also supports ODCS v3 and Data Contract SQL quality checks on field and model level
  • Support for import from Iceberg table definitions.
  • Support for decimal logical type on avro export.
  • Support for custom Trino types
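
A minimal sketch of the new ODCS v3 export, assuming a datacontract.yaml in the working directory; by default the export is written to stdout:

```python
import subprocess

# Export an existing data contract to Open Data Contract Standard v3.0.0.
# "odcs" is now an alias for odcs_v3; the previous output remains available
# via the odcs_v2 format (see Changed below).
result = subprocess.run(
    ["datacontract", "export", "--format", "odcs_v3", "datacontract.yaml"],
    check=True,
    capture_output=True,
    text=True,
)
print(result.stdout)
```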

Changed

  • datacontract import --format odcs: Now supports ODCS v3.0.0 files (#474)
  • datacontract export --format odcs: Now creates v3.0.0 Open Data Contract Standard files (alias to odcs_v3). Old versions are still available as format odcs_v2. (#460)

Fixed

  • Fix timestamp serialization from Parquet to DuckDB (#472)

v0.10.13

20 Sep 07:19
Prepare 0.10.13

v0.10.12

08 Sep 15:07

Added

  • Support for import of DBML Models (#379)
  • datacontract export --format sqlalchemy: Export to SQLAlchemy ORM models (#399)
  • Support of varchar max length in Glue import (#351)
  • datacontract publish now also accepts the API key via the DATACONTRACT_MANAGER_API_KEY environment variable (see the sketch after this list)
  • Support required fields for Avro schema export (#390)
  • Support data type map in Spark import and export (#408)
  • Support for enum on Avro export
  • Support for enum title on Avro import
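
A minimal sketch of publishing with the API key supplied via the environment; the key value and contract file are placeholders:

```python
import os
import subprocess

# The API key can now be provided via DATACONTRACT_MANAGER_API_KEY
# instead of being passed on the command line.
env = dict(os.environ, DATACONTRACT_MANAGER_API_KEY="<your-api-key>")
subprocess.run(["datacontract", "publish", "datacontract.yaml"], check=True, env=env)
```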

Changed

  • Delta Lake now uses DuckDB's native deltalake support (#258). The deltalake extra has been removed.
  • When dumping to YAML (import), the alias name is used instead of the Pythonic name. (#373)

Fixed

  • Fix an issue where the datacontract CLI fails if installed without any extras (#400)
  • Fix an issue where Glue database without a location creates invalid data contract (#351)
  • Fix bigint -> long data type mapping (#351)
  • Fix an issue where column description for Glue partition key column is ignored (#351)
  • Corrected the name of the table parameter for BigQuery import (#377)
  • Fix a failure to connect to the S3 server (#384)
  • Fix a model bug where the model did not match the specification (definitions.fields) (#375)
  • Fix array type management in Spark import (#408)

v0.10.11

08 Aug 20:34

Added

  • Support data type map in Glue import. (#340)
  • Basic HTML export for the new keys and values fields.
  • Support for recognition of 1 to 1 relationships when exporting to DBML.
  • Added support for arrays in JSON schema import (#305)

Changed

  • Aligned JSON schema import and export of required properties

Fixed

  • Fix required field handling in JSON schema import
  • Fix an issue where quality and definition $ref references were not always resolved.
  • Fix an issue where the JSON schema validation fails for a field with type string and format uuid
  • Fix an issue where common DBML renderers may not be able to parse parts of an exported file.

v0.10.10

18 Jul 15:30

[0.10.10] - 2024-07-18

Added

  • Add support for the dbt manifest file (#104) (see the sketch after this list)
  • datacontract import --format spark: Import from Spark tables (#326)
  • Adds support for referencing fields within a definition (#322)
  • Add map and enum type for Avro schema import (#311)
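
A minimal sketch of the new dbt manifest import; the format name dbt and the manifest.json path are assumptions based on the release note above:

```python
import subprocess

# Generate a data contract from a dbt manifest and write it to stdout.
subprocess.run(
    ["datacontract", "import", "--format", "dbt", "--source", "manifest.json"],
    check=True,
)
```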

Fixed

  • Fix import of pyspark for type-checking when pyspark isn't required as a module (#312)
  • Fix an issue where specifying glue_table as parameter did not filter the tables and instead returned all tables from source database (#333)

v0.10.9

03 Jul 13:36

[0.10.9] - 2024-07-03

Added

  • Add support for Trino (#278)
  • Spark export: add a Spark StructType exporter (#277) (see the sketch after this list)
  • Add a --schema option for the catalog and export commands to provide the schema locally as well
  • Integrate support into the pre-commit workflow; see the project documentation for further details.
  • Improved HTML export, supporting links, tags, and more
  • Add support for AWS SESSION_TOKEN (#309)
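
A minimal sketch of the new Spark export; the format name spark is an assumption based on the release note above:

```python
import subprocess

# Print the data contract model as a Spark StructType definition.
result = subprocess.run(
    ["datacontract", "export", "--format", "spark", "datacontract.yaml"],
    check=True,
    capture_output=True,
    text=True,
)
print(result.stdout)
```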

Changed

  • Added array management on HTML export (#299)

Fixed

  • Fix datacontract import --format jsonschema when description is missing (#300)
  • Fix datacontract test with case-sensitive Postgres table names (#310)

v0.10.8

19 Jun 08:04

Added

  • datacontract serve starts a local web server that provides a REST API for the commands (see the sketch after this list)
  • Provide a server for the sql export to generate the appropriate schema (#153)
  • Add struct and array management to Glue export (#271)
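
A minimal sketch of launching the new REST API server; endpoints, default host, and port are not documented in these notes, so this only shows starting and stopping the process:

```python
import subprocess
import time

# Start the local web server that exposes the CLI commands as a REST API.
server = subprocess.Popen(["datacontract", "serve"])
try:
    time.sleep(5)  # placeholder for calls against the REST API
finally:
    server.terminate()
```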

Changed

  • Introduced optional dependencies/extras for significantly faster installation times. (#213)
  • Added delta-lake as an additional optional dependency
  • Support GOOGLE_APPLICATION_CREDENTIALS as an environment variable for connecting to BigQuery in datacontract test (see the sketch after this list)
  • Better support for BigQuery's type attribute; don't assume all imported models are tables
  • Added an initial implementation of a Unity Catalog importer (not all data types are supported yet)
  • Added an importer factory. This refactoring makes it easier to create new importers and thus supports the growth and maintainability of the project. (#273)
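
A minimal sketch of running a test against BigQuery with credentials supplied via GOOGLE_APPLICATION_CREDENTIALS; the key file path is a placeholder:

```python
import os
import subprocess

# Point GOOGLE_APPLICATION_CREDENTIALS at a service account key file so that
# `datacontract test` can authenticate against BigQuery.
env = dict(os.environ, GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json")
subprocess.run(["datacontract", "test", "datacontract.yaml"], check=True, env=env)
```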

Fixed

  • datacontract export --format avro fixed array structure (#243)

v0.10.7

31 May 11:23

Added

  • Test data contract against dataframes / temporary views (#175)
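
A minimal sketch of testing a contract against a DataFrame registered as a temporary view, following the pattern documented in the project README; the DataFrame contents and the model name "orders" are placeholders:

```python
from pyspark.sql import SparkSession
from datacontract.data_contract import DataContract

spark = SparkSession.builder.getOrCreate()

# Register a DataFrame under the model name defined in datacontract.yaml.
df = spark.createDataFrame([(1, "shipped")], ["order_id", "status"])
df.createOrReplaceTempView("orders")

# Pass the active Spark session so the temporary view is used for testing.
data_contract = DataContract(data_contract_file="datacontract.yaml", spark=spark)
run = data_contract.test()
print(run.has_passed())
```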

Fixed

  • Avro export: logical types should be nested (#233)

v0.10.6

29 May 10:17

[0.10.6] - 2024-05-29

Fixed

  • Fixed Docker build by removing msodbcsql18 dependency (temporary workaround)