Releases · datacontract/datacontract-cli
v0.10.15
[0.10.15] - 2024-10-26
Added
- Support for model import from parquet file metadata.
- Great Expectations export: add optional args (#496)
  - `suite_name`: the name of the expectation suite to export
  - `engine`: used to run checks
  - `sql_server_type`: to define the type of SQL Server to use when `engine` is `sql`
- Changelog support for `Info` and `Terms` blocks.
- `datacontract import` now has an `--output` option for saving the data contract to a file (see the example after this list).
- Enhance JSON file validation (local and S3) to return the first error for each JSON object; the maximum number of total errors can be configured via the environment variable `DATACONTRACT_MAX_ERRORS`. Furthermore, the `primaryKey` is additionally added to the error message.
- Fix an issue where records with no fields create an invalid BigQuery schema.
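A minimal usage sketch of the new `--output` option and the `DATACONTRACT_MAX_ERRORS` variable (file names and the error cap are illustrative):

```bash
# Import a model from Parquet file metadata and save the resulting
# data contract to a file instead of printing it to stdout
datacontract import --format parquet --source orders.parquet --output datacontract.yaml

# Cap the total number of reported JSON validation errors during testing
export DATACONTRACT_MAX_ERRORS=20
datacontract test datacontract.yaml
```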
Changed
- Changelog support for custom extension keys in `Models` and `Fields` blocks.
- `datacontract catalog --files '*.yaml'` now also checks subfolders for such files (see the example after this list).
- Optimize the test output table on the console if tests fail.
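For example, the catalog command now picks up contracts in nested folders as well (the output directory is illustrative):

```bash
# Build a catalog from all matching contract files,
# including those in subfolders
datacontract catalog --files '*.yaml' --output ./catalog
```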
Fixed
- Raise a valid exception in `DataContractSpecification.from_file` if the file does not exist
- Fix importing JSON Schemas containing deeply nested objects without a `required` array
- SodaCL: Only add data quality tests for executable queries
v0.10.14
[0.10.14] - 2024-10-26
Data Contract CLI now supports the Open Data Contract Standard (ODCS) v3.0.0.
Added
- `datacontract test` now also supports the ODCS v3 data contract format
- `datacontract export --format odcs_v3`: Export to Open Data Contract Standard v3.0.0 (#460)
- `datacontract test` now also supports ODCS v3 and Data Contract SQL quality checks on field and model level
- Support for import from Iceberg table definitions.
- Support for the decimal logical type on Avro export.
- Support for custom Trino types
Changed
- `datacontract import --format odcs`: Now supports ODCS v3.0.0 files (#474)
- `datacontract export --format odcs`: Now creates v3.0.0 Open Data Contract Standard files (alias to `odcs_v3`). The old version is still available as format `odcs_v2`. (#460) See the example after this list.
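A short sketch of the ODCS v3 round trip described above (file names are illustrative):

```bash
# Export a data contract to Open Data Contract Standard v3.0.0
# (odcs is now an alias for odcs_v3; use odcs_v2 for the old output)
datacontract export --format odcs_v3 datacontract.yaml > odcs.yaml

# Import an ODCS v3.0.0 document back into a data contract
datacontract import --format odcs --source odcs.yaml
```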
Fixed
- Fix timestamp serialization from Parquet to DuckDB (#472)
v0.10.13
Prepare 0.10.13
v0.10.12
Added
- Support for import of DBML Models (#379)
- `datacontract export --format sqlalchemy`: Export to SQLAlchemy ORM models (#399)
- Support for varchar max length in Glue import (#351)
- `datacontract publish` now also accepts the `DATACONTRACT_MANAGER_API_KEY` as an environment variable (see the example after this list)
- Support required fields for Avro schema export (#390)
- Support the data type map in Spark import and export (#408)
- Support for enums on export to Avro
- Support for enum titles on Avro import
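A minimal sketch of publishing with the new environment variable (the key value is a placeholder):

```bash
# Authenticate against the Data Contract Manager via environment
# variable instead of passing an API key on the command line
export DATACONTRACT_MANAGER_API_KEY="<your-api-key>"
datacontract publish datacontract.yaml
```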
Changed
- Delta Lake now uses DuckDB's native Delta Lake support (#258). The `deltalake` extra has been removed.
- When dumping to YAML (import), the alias name is used instead of the Pythonic name. (#373)
Fixed
- Fix an issue where the datacontract cli fails if installed without any extras (#400)
- Fix an issue where Glue database without a location creates invalid data contract (#351)
- Fix bigint -> long data type mapping (#351)
- Fix an issue where column description for Glue partition key column is ignored (#351)
- Corrected name of table parameter for bigquery import (#377)
- Fix a failure to connect to the S3 server (#384)
- Fix a model bug that mismatched the specification (`definitions.fields`) (#375)
- Fix array type management in Spark import (#408)
v0.10.11
Added
- Support data type map in Glue import. (#340)
- Basic HTML export for the new `keys` and `values` fields.
- Support for recognizing 1:1 relationships when exporting to DBML (see the example after this list).
- Added support for arrays in JSON schema import (#305)
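For reference, DBML output is produced via the export command (a sketch; the file name is illustrative):

```bash
# Export the data contract's models, including 1:1 relationships,
# as a DBML document
datacontract export --format dbml datacontract.yaml
```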
Changed
- Aligned JSON schema import and export of required properties
Fixed
- Fix required field handling in JSON schema import
- Fix an issue where the quality and definition `$ref` are not always resolved.
- Fix an issue where the JSON schema validation fails for a field with type `string` and format `uuid`.
- Fix an issue where common DBML renderers may not be able to parse parts of an exported file.
v0.10.10
v0.10.9
[0.10.9] - 2024-07-03
Added
- Add support for Trino (#278)
- Spark export: add Spark StructType exporter (#277)
- Add a `--schema` option for the `catalog` and `export` commands to provide the schema locally as well
- Integrate support into the pre-commit workflow; see the project documentation for further details.
- Improved HTML export, supporting links, tags, and more
- Add support for AWS SESSION_TOKEN (#309)
Changed
- Added array management on HTML export (#299)
Fixed
v0.10.8
Added
- `datacontract serve`: start a local web server that provides a REST API for the commands (see the example after this list)
- Use the server for SQL export to determine the appropriate schema (#153)
- Add struct and array management to Glue export (#271)
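A quick sketch of the new serve command:

```bash
# Start a local web server exposing the CLI commands as a REST API
datacontract serve
```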
Changed
- Introduced optional dependencies/extras for significantly faster installation times. (#213)
- Added delta-lake as an additional optional dependency
- Support `GOOGLE_APPLICATION_CREDENTIALS` as an environment variable for connecting to BigQuery in `datacontract test` (see the example after this list)
- Better support for BigQuery's `type` attribute; don't assume all imported models are tables
- Added an initial implementation of an importer for Unity Catalog (not all data types are supported yet)
- Added an importer factory. This refactoring aims to make it easier to create new importers, and thus supports the growth and maintainability of the project. (#273)
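A minimal sketch combining the extras-based install and the BigQuery credentials variable (the extra name and paths are illustrative):

```bash
# Install only the extras you need for faster installs
pip install 'datacontract-cli[bigquery]'

# Point datacontract test at a service account for BigQuery
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json
datacontract test datacontract.yaml
```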
Fixed
- `datacontract export --format avro`: fixed the array structure (#243)
v0.10.7
v0.10.6
[0.10.6] - 2024-05-29
Fixed
- Fixed Docker build by removing msodbcsql18 dependency (temporary workaround)