2020 MAY 25
- Adds `OVERWRITE` user parameter to be used with the Export UDF. #36 #81
- Adds the compression codec as an additional extension to the exported file. #82 #83
2020 MAR 27
2020 MAR 16
- Adds an initial system requirements document for the upcoming Amazon Kinesis Connector. #72 #73
- Improves the user guide with an additional Exasol connection object usage example. #66 #76
- Updates dependencies so that the project runs with IntelliJ IDEA. #70 #71
2020 FEB 17
- Fixes a bug with regex pattern matching on paths. #65 #67
- Adds a list of dependencies with their licenses to the README.md. #63 #64
- Adds a logo to the project. #60
2019 DEC 06
- Improves the Azure Blob Storage parameters: the `AZURE_ACCOUNT_NAME` and `AZURE_CONTAINER_NAME` properties are no longer required, since they are available in the path. #50 #55
- Fixes a bug when importing Parquet `INT64 (TIMESTAMP_MILLIS)` type values. #52 #53
2019 OCT 31 🎃
- Adds an Apache Kafka consumer UDF to import Avro-formatted data from Kafka clusters. #40 #39 #48
- Adds several new Kafka consumer settings as user-provided UDF parameters. #41 #43
- Refactors the UDF user properties handling to make it more robust and maintainable. #46
- Reworks the documentation to incorporate the new Kafka import changes. #45 #47
2019 OCT 27
2019 JUL 01
2019 MAY 28
- Adds a batch size parameter that helps with exporting large tables. #28 #32
- Applies some refactoring. #29
2019 APR 15
2019 MAR 21
2019 MAR 08
2019 FEB 12
- Adds a feature to export Exasol tables into AWS S3 in Parquet format. #14 #16
- Imports date and timestamp values correctly into Exasol. #14 #16
2018 DEC 13
2018 DEC 06
- Adds initial Azure Blob Storage import support. #10
2018 DEC 05
- Adds initial Google Cloud Storage (GCS) import functionality. #9
2018 NOV 28
- Adds the AWS S3 endpoint as a parameter. #8
2018 NOV 27
- Adds a Travis-based automated build. #3
- Renames packages and external files from s3etl to the more generic cloudetl. #4
- Adds an automated GitHub release when a tag is pushed. #5
2018 NOV 23
- Initial release with AWS S3 and Parquet format support, primitive types only.