Release 0.3.19 (#1283)
# Description

Please describe the change you have made.

## Checklist

- [ ] Tests added/updated.
- [ ] Run Demo Job Locally.
- [ ] Documentation updated.
- [ ] Changelogs updated in
[CHANGELOG.cdf-tk.md](https://github.com/cognitedata/toolkit/blob/main/CHANGELOG.cdf-tk.md).
- [ ] Template changelogs updated in
[CHANGELOG.templates.md](https://github.com/cognitedata/toolkit/blob/main/CHANGELOG.templates.md).
- [ ] Version bumped in
  [_version.py](https://github.com/cognitedata/toolkit/blob/main/cognite/cognite_toolkit/_version.py)
  and [pyproject.toml](https://github.com/cognitedata/toolkit/blob/main/pyproject.toml)
  per [semantic versioning](https://semver.org/).
doctrino authored Dec 9, 2024
2 parents bea4e66 + eb9fe24 commit 679f193
Showing 68 changed files with 920 additions and 568 deletions.
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -12,7 +12,7 @@ repos:
- --fixable=E,W,F,I,T,RUF,TID,UP
- --target-version=py39
- id: ruff-format
rev: v0.8.1
rev: v0.8.2

- repo: https://github.com/igorshubovych/markdownlint-cli
rev: v0.43.0
26 changes: 26 additions & 0 deletions CHANGELOG.cdf-tk.md
@@ -15,6 +15,32 @@ Changes are grouped as follows:
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.

## [0.3.19] - 2024-12-09

### Added

- [alpha feature] `cdf purge dataset` now supports purging resources with internal IDs.

### Fixed

- Replacing variables in an inline SQL query no longer removes the quotes around the variable.
- Running `cdf build` on an older module will no longer raise a `KeyError` if the `module.toml` does
  not have a `package` key.
- [alpha feature] `cdf purge dataset` no longer deletes `LocationFilters`.
- [alpha feature] `GraphQL` resources with views that specify a `rawFilter` no longer raise an error when
running `cdf deploy`.
- In the `cdf dump datamodel` command, properties that are overridden in a view are now correctly dumped.

### Changed

- [alpha feature] `cdf purge` now requires a confirmation before deleting resources.
- Building a `Transformation` will store the `.sql` file in the build directory instead of inlined in the
resource YAML file.

### Improved

- Consistent display names of resources in output table of `cdf deploy` and `cdf clean`.

## [0.3.18] - 2024-12-03

### Fixed
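The first `Fixed` entry in the 0.3.19 changelog (variable replacement no longer stripping quotes from inline SQL) can be sketched minimally as follows. This is an illustration only; the `{{ name }}` placeholder syntax and the `substitute` helper are assumptions, not the Toolkit's actual API:

```python
import re

def substitute(template: str, variables: dict[str, str]) -> str:
    """Replace {{ name }} placeholders, leaving any surrounding quotes intact."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        # Replace only the placeholder itself; unknown names are left untouched.
        return variables.get(name, match.group(0))
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", repl, template)

sql = "SELECT * FROM db.table WHERE site = '{{ site }}'"
print(substitute(sql, {"site": "oslo"}))
# SELECT * FROM db.table WHERE site = 'oslo'
```

The key point is that substitution operates strictly inside the placeholder span, so the quotes that belong to the SQL literal survive.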
4 changes: 4 additions & 0 deletions CHANGELOG.templates.md
@@ -15,6 +15,10 @@ Changes are grouped as follows:
- `Fixed` for any bug fixes.
- `Security` in case of vulnerabilities.

## [0.3.19] - 2024-12-09

No changes to templates.

## [0.3.18] - 2024-12-03

No changes to templates.
2 changes: 1 addition & 1 deletion README.md
@@ -41,7 +41,7 @@ More details about the tool can be found at
[docs.cognite.com](https://docs.cognite.com/cdf/deploy/cdf_toolkit/).

You can find an overview of the modules and packages in the
[module and package documentation](https://docs.cognite.com/cdf/deploy/cdf_toolkit/references/module_reference).
[module and package documentation](https://docs.cognite.com/cdf/deploy/cdf_toolkit/references/resource_library).

See [./CONTRIBUTING.md](./CONTRIBUTING.md) for information about how to contribute to the `cdf-tk` tool or
templates.
2 changes: 1 addition & 1 deletion cdf.toml
@@ -23,4 +23,4 @@ dump = true
[modules]
# This is the version of the modules. It should not be changed manually.
# It will be updated by the 'cdf module upgrade' command.
version = "0.3.18"
version = "0.3.19"
2 changes: 1 addition & 1 deletion cognite_toolkit/_builtin_modules/cdf.toml
@@ -4,7 +4,7 @@ default_env = "<DEFAULT_ENV_PLACEHOLDER>"
[modules]
# This is the version of the modules. It should not be changed manually.
# It will be updated by the 'cdf module upgrade' command.
version = "0.3.18"
version = "0.3.19"


[plugins]
20 changes: 19 additions & 1 deletion cognite_toolkit/_cdf_tk/apps/_purge.py
@@ -17,7 +17,7 @@ def __init__(self, *args: Any, **kwargs: Any) -> None:
def main(self, ctx: typer.Context) -> None:
"""Commands purge functionality"""
if ctx.invoked_subcommand is None:
print("Use [bold yellow]cdf pull --help[/] for more information.")
print("Use [bold yellow]cdf purge --help[/] for more information.")

def purge_dataset(
self,
@@ -44,6 +44,14 @@ def purge_dataset(
help="Whether to do a dry-run, do dry-run if present.",
),
] = False,
auto_yes: Annotated[
bool,
typer.Option(
"--yes",
"-y",
help="Automatically confirm that you are sure you want to purge the dataset.",
),
] = False,
verbose: Annotated[
bool,
typer.Option(
@@ -61,6 +69,7 @@ def purge_dataset(
external_id,
include_dataset,
dry_run,
auto_yes,
verbose,
)
)
@@ -90,6 +99,14 @@ def purge_space(
help="Whether to do a dry-run, do dry-run if present.",
),
] = False,
auto_yes: Annotated[
bool,
typer.Option(
"--yes",
"-y",
help="Automatically confirm that you are sure you want to purge the space.",
),
] = False,
verbose: Annotated[
bool,
typer.Option(
@@ -109,6 +126,7 @@ def purge_space(
space,
include_space,
dry_run,
auto_yes,
verbose,
)
)
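The diff above wires a new `-y`/`--yes` option into the Typer CLI so scripted runs can skip the confirmation prompt. A rough stand-alone equivalent of the flag's behaviour, sketched with only the standard library's `argparse` (program and option names are illustrative):

```python
import argparse

parser = argparse.ArgumentParser(prog="cdf-purge-demo")
parser.add_argument("--dry-run", action="store_true",
                    help="Report what would be deleted without deleting anything.")
parser.add_argument("-y", "--yes", dest="auto_yes", action="store_true",
                    help="Skip the interactive 'are you sure?' confirmation.")

# Simulate invoking the command as: cdf-purge-demo -y --dry-run
args = parser.parse_args(["-y", "--dry-run"])
print(args.auto_yes, args.dry_run)  # True True
```

With `store_true`, both flags default to `False`, matching the `= False` defaults in the Typer signatures above.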
4 changes: 2 additions & 2 deletions cognite_toolkit/_cdf_tk/builders/_base.py
@@ -63,7 +63,7 @@ def validate_directory(
return WarningList[ToolkitWarning]()

# Helper methods
def _create_destination_path(self, source_path: Path, module_dir: Path, kind: str) -> Path:
def _create_destination_path(self, source_path: Path, kind: str) -> Path:
"""Creates the filepath in the build directory for the given source path.
Note that this is a complex operation as the modules in the source are nested while the build directory is flat.
@@ -153,7 +153,7 @@ def build(
if warning is not None:
yield [warning]
continue
destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

destination = BuildDestinationFile(
path=destination_path,
2 changes: 1 addition & 1 deletion cognite_toolkit/_cdf_tk/builders/_datamodels.py
@@ -39,7 +39,7 @@ def build(
yield [warning]
continue

destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

extra_sources: list[SourceLocation] | None = None
if loader is GraphQLLoader:
2 changes: 1 addition & 1 deletion cognite_toolkit/_cdf_tk/builders/_file.py
@@ -32,7 +32,7 @@ def build(
continue
if loader in {FileMetadataLoader, CogniteFileLoader}:
loaded = self._expand_file_metadata(loaded, module, console)
destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

yield BuildDestinationFile(
path=destination_path,
2 changes: 1 addition & 1 deletion cognite_toolkit/_cdf_tk/builders/_function.py
@@ -43,7 +43,7 @@ def build(
if loader is FunctionLoader:
warnings = self.copy_function_directory_to_build(source_file)

destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

yield BuildDestinationFile(
path=destination_path,
2 changes: 1 addition & 1 deletion cognite_toolkit/_cdf_tk/builders/_raw.py
@@ -51,7 +51,7 @@ def build(
for loader, entries in entry_by_loader.items():
if not entries:
continue
destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

if loader is RawDatabaseLoader and has_split_table_and_database:
# We have inferred the database from a Table file, so we need to recalculate the hash
2 changes: 1 addition & 1 deletion cognite_toolkit/_cdf_tk/builders/_streamlit.py
@@ -40,7 +40,7 @@ def build(
if loader is StreamlitLoader:
warnings = self.copy_app_directory_to_build(source_file)

destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
destination_path = self._create_destination_path(source_file.source.path, loader.kind)

yield BuildDestinationFile(
path=destination_path,
12 changes: 8 additions & 4 deletions cognite_toolkit/_cdf_tk/builders/_transformation.py
@@ -37,11 +37,11 @@ def build(
yield [warning]
continue

destination_path = self._create_destination_path(source_file.source.path, loader.kind)

extra_sources: list[SourceLocation] | None = None
if loader is TransformationLoader:
extra_sources = self._add_query(loaded, source_file, query_files)

destination_path = self._create_destination_path(source_file.source.path, module.dir, loader.kind)
extra_sources = self._add_query(loaded, source_file, query_files, destination_path)

destination = BuildDestinationFile(
path=destination_path,
@@ -57,6 +57,7 @@ def _add_query(
loaded: dict[str, Any] | list[dict[str, Any]],
source_file: BuildSourceFile,
query_files: dict[Path, BuildSourceFile],
transformation_destination_path: Path,
) -> list[SourceLocation]:
loaded_list = loaded if isinstance(loaded, list) else [loaded]
extra_sources: list[SourceLocation] = []
@@ -80,7 +81,10 @@
filepath,
)
elif query_file is not None:
entry["query"] = query_file.content
destination_path = self._create_destination_path(query_file.source.path, "Query")
destination_path.write_text(query_file.content)
relative = destination_path.relative_to(transformation_destination_path.parent)
entry["query"] = relative.as_posix()
extra_sources.append(query_file.source)

return extra_sources
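The `_add_query` change above stops inlining SQL into the resource YAML: the query is written as its own file in the build directory and the transformation entry stores a path relative to the YAML instead. A minimal stand-alone sketch of that mechanism, with hypothetical file names:

```python
import tempfile
from pathlib import Path

def externalize_query(entry: dict, query_content: str, build_dir: Path,
                      transformation_yaml: Path) -> Path:
    """Write the SQL next to the transformation YAML and store a relative
    path in the entry instead of the inlined query text."""
    query_path = build_dir / "my_transformation.Query.sql"  # hypothetical naming
    query_path.write_text(query_content)
    # Store a POSIX-style path relative to the YAML file's directory.
    entry["query"] = query_path.relative_to(transformation_yaml.parent).as_posix()
    return query_path

build_dir = Path(tempfile.mkdtemp())
yaml_path = build_dir / "my_transformation.Transformation.yaml"
entry: dict = {}
externalize_query(entry, "SELECT 1", build_dir, yaml_path)
print(entry["query"])  # my_transformation.Query.sql
```

Keeping the path relative means the deploy step can resolve the `.sql` file regardless of where the build directory lives.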
32 changes: 27 additions & 5 deletions cognite_toolkit/_cdf_tk/commands/_purge.py
@@ -44,15 +44,23 @@ def space(
space: str | None = None,
include_space: bool = False,
dry_run: bool = False,
auto_yes: bool = False,
verbose: bool = False,
) -> None:
"""Purge a space and all its content"""
selected_space = self._get_selected_space(space, ToolGlobals.toolkit_client)
self._print_panel("space", selected_space)
if space is None:
# Interactive mode
include_space = questionary.confirm("Do you also want to delete the space itself?", default=False).ask()
dry_run = questionary.confirm("Dry run?", default=True).ask()
if not dry_run:
self._print_panel("space", selected_space)
if not auto_yes:
confirm = questionary.confirm(
f"Are you really sure you want to purge the {selected_space!r} space?", default=False
).ask()
if not confirm:
return

loaders = self._get_dependencies(
SpaceLoader,
@@ -118,17 +126,25 @@ def dataset(
external_id: str | None = None,
include_dataset: bool = False,
dry_run: bool = False,
auto_yes: bool = False,
verbose: bool = False,
) -> None:
"""Purge a dataset and all its content"""
selected_dataset = self._get_selected_dataset(external_id, ToolGlobals.toolkit_client)
self._print_panel("dataset", selected_dataset)
if external_id is None:
# Interactive mode
include_dataset = questionary.confirm(
"Do you want to archive the dataset itself after the purge?", default=False
).ask()
dry_run = questionary.confirm("Dry run?", default=True).ask()
if not dry_run:
self._print_panel("dataset", selected_dataset)
if not auto_yes:
confirm = questionary.confirm(
f"Are you really sure you want to purge the {selected_dataset!r} dataset?", default=False
).ask()
if not confirm:
return

loaders = self._get_dependencies(
DataSetsLoader,
Expand All @@ -139,6 +155,7 @@ def dataset(
StreamlitLoader,
HostedExtractorDestinationLoader,
FunctionLoader,
LocationFilterLoader,
},
)
is_purged = self._purge(
@@ -245,9 +262,14 @@ def _purge(
try:
batch_ids.append(loader.get_id(resource))
except ToolkitRequiredValueError as e:
self.warn(HighSeverityWarning(f"Cannot delete {resource.dump()!r}. Failed to obtain ID: {e}"))
is_purged = False
continue
try:
batch_ids.append(loader.get_internal_id(resource))
except (AttributeError, NotImplementedError):
self.warn(
HighSeverityWarning(f"Cannot delete {type(resource).__name__}. Failed to obtain ID: {e}")
)
is_purged = False
continue

if len(batch_ids) >= batch_size:
child_deletion = self._delete_children(batch_ids, child_loaders, dry_run, verbose)
19 changes: 11 additions & 8 deletions cognite_toolkit/_cdf_tk/commands/build.py
@@ -105,6 +105,7 @@ def __init__(self, print_warning: bool = True, skip_tracking: bool = False, sile
defaultdict(list)
)
self._has_built = False
self._printed_variable_tree_structure_hint = False

def execute(
self,
@@ -485,14 +486,16 @@ def _check_variables_replaced(self, content: str, module: Path, source_path: Pat
if len(module_names) == 1
else (", ".join(module_names[:-1]) + f" or {module_names[-1]}")
)
self.console(
f"The variables in 'config.[ENV].yaml' need to be organised in a tree structure following"
f"\n the folder structure of the modules, but can also be moved up the config hierarchy to be shared between modules."
f"\n The variable {variable!r} is defined in the variable section{'s' if len(module_names) > 1 else ''} {module_str}."
f"\n Check that {'these paths reflect' if len(module_names) > 1 else 'this path reflects'} "
f"the location of {module.as_posix()}.",
prefix=" [bold green]Hint:[/] ",
)
if not self._printed_variable_tree_structure_hint:
self._printed_variable_tree_structure_hint = True
self.console(
f"The variables in 'config.[ENV].yaml' need to be organised in a tree structure following"
f"\n the folder structure of the modules, but can also be moved up the config hierarchy to be shared between modules."
f"\n The variable {variable!r} is defined in the variable section{'s' if len(module_names) > 1 else ''} {module_str}."
f"\n Check that {'these paths reflect' if len(module_names) > 1 else 'this path reflects'} "
f"the location of {module.as_posix()}.",
prefix=" [bold green]Hint:[/] ",
)
self.warning_list.extend(warning_list)
if self.print_warning and warning_list:
print(str(warning_list))
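The new `_printed_variable_tree_structure_hint` flag above ensures the variable-layout hint is printed at most once per build instead of once per offending variable. The pattern in isolation, with an illustrative class name:

```python
class HintPrinter:
    """Emit a repeated hint only once per run (print-once guard pattern)."""
    def __init__(self) -> None:
        self._printed_hint = False
        self.messages: list[str] = []

    def hint(self, message: str) -> None:
        if self._printed_hint:
            return  # already shown; stay quiet for the rest of the run
        self._printed_hint = True
        self.messages.append(message)

p = HintPrinter()
for _ in range(3):
    p.hint("Variables must follow the module folder structure.")
print(len(p.messages))  # 1
```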
16 changes: 4 additions & 12 deletions cognite_toolkit/_cdf_tk/commands/dump.py
@@ -14,8 +14,9 @@
from rich.panel import Panel

from cognite_toolkit._cdf_tk.exceptions import ToolkitMissingResourceError
from cognite_toolkit._cdf_tk.loaders import ViewLoader
from cognite_toolkit._cdf_tk.tk_warnings import MediumSeverityWarning
from cognite_toolkit._cdf_tk.utils import CDFToolConfig, retrieve_view_ancestors
from cognite_toolkit._cdf_tk.utils import CDFToolConfig

from ._base import ToolkitCommand

@@ -53,8 +54,6 @@ def execute(
space_ids = {item.space for item in itertools.chain(containers, views, [data_model])}
spaces = client.data_modeling.spaces.retrieve(list(space_ids))

views_by_id = {view.as_id(): view for view in views}

is_populated = output_dir.exists() and any(output_dir.iterdir())
if is_populated and clean:
shutil.rmtree(output_dir)
@@ -89,22 +88,15 @@ def execute(
suffix_version = len(views) != len({f"{view.space}{view.external_id}" for view in views})
view_folder = resource_folder / "views"
view_folder.mkdir(exist_ok=True)
view_loader = ViewLoader.create_loader(ToolGlobals, None)
for view in views:
file_name = f"{view.external_id}.view.yaml"
if prefix_space:
file_name = f"{view.space}_{file_name}"
if suffix_version:
file_name = f"{file_name.removesuffix('.view.yaml')}_{view.version}.view.yaml"
view_file = view_folder / file_name
view_write = view.as_write().dump()
parents = retrieve_view_ancestors(client, view.implements or [], views_by_id)
for parent in parents:
for prop_name in parent.properties.keys():
view_write["properties"].pop(prop_name, None)
if not view_write["properties"]:
# All properties were removed, so we remove the properties key.
view_write.pop("properties", None)

view_write = view_loader.dump_as_write(view)
view_file.write_text(yaml.safe_dump(view_write, sort_keys=False))
if verbose:
print(f" [bold green]INFO:[/] Dumped view {view.as_id()} to {view_file!s}.")
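The dump logic above moves inherited-property stripping into `ViewLoader.dump_as_write`, which fixes the changelog's "properties that are overridden in a view are now correctly dumped". The essential rule is: drop a property only when it is identical to the parent's definition, so genuine overrides survive. A sketch using plain dicts (not the SDK's view classes):

```python
def strip_inherited(view: dict, parents: list[dict]) -> dict:
    """Remove properties a view merely inherits, keeping own and overridden ones."""
    dumped = dict(view)
    properties = dict(dumped.get("properties", {}))
    for parent in parents:
        for name, prop in parent.get("properties", {}).items():
            # Only drop the property if it matches the parent's definition exactly;
            # an overridden property differs and must stay in the dump.
            if properties.get(name) == prop:
                properties.pop(name)
    if properties:
        dumped["properties"] = properties
    else:
        dumped.pop("properties", None)  # all properties were inherited
    return dumped

child = {"externalId": "Pump", "properties": {"name": {"type": "text"},
                                              "rpm": {"type": "int32"}}}
parent = {"externalId": "Asset", "properties": {"name": {"type": "text"}}}
print(strip_inherited(child, [parent]))
# {'externalId': 'Pump', 'properties': {'rpm': {'type': 'int32'}}}
```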