Add support for logs_archives resource #270

Merged (4 commits) on Sep 23, 2024
72 changes: 38 additions & 34 deletions README.md
@@ -206,40 +206,42 @@ When running against multiple destination organizations, a separate working directory

#### Supported resources

| Resource                               | Description                                                          |
|----------------------------------------|----------------------------------------------------------------------|
| authn_mappings                         | Sync Datadog authn mappings.                                         |
| dashboard_lists                        | Sync Datadog dashboard lists.                                        |
| dashboards                             | Sync Datadog dashboards.                                             |
| downtime_schedules                     | Sync Datadog downtimes.                                              |
| downtimes (**deprecated**)             | Sync Datadog downtimes.                                              |
| host_tags                              | Sync Datadog host tags.                                              |
| logs_archives                          | Sync Datadog logs archives. Requires GCP, Azure, or AWS integration. |
| logs_archives_order                    | Sync Datadog logs archives order.                                    |
| logs_custom_pipelines (**deprecated**) | Sync Datadog logs custom pipelines.                                  |
| logs_indexes                           | Sync Datadog logs indexes.                                           |
| logs_indexes_order                     | Sync Datadog logs indexes order.                                     |
| logs_metrics                           | Sync Datadog logs metrics.                                           |
| logs_pipelines                         | Sync Datadog logs OOTB integration and custom pipelines.             |
| logs_pipelines_order                   | Sync Datadog logs pipelines order.                                   |
| logs_restriction_queries               | Sync Datadog logs restriction queries.                               |
| metric_percentiles                     | Sync Datadog metric percentiles.                                     |
| metric_tag_configurations              | Sync Datadog metric tag configurations.                              |
| metrics_metadata                       | Sync Datadog metric metadata.                                        |
| monitors                               | Sync Datadog monitors.                                               |
| notebooks                              | Sync Datadog notebooks.                                              |
| powerpacks                             | Sync Datadog powerpacks.                                             |
| restriction_policies                   | Sync Datadog restriction policies.                                   |
| roles                                  | Sync Datadog roles.                                                  |
| sensitive_data_scanner_groups          | Sync SDS groups.                                                     |
| sensitive_data_scanner_groups_order    | Sync SDS groups order.                                               |
| sensitive_data_scanner_rules           | Sync SDS rules.                                                      |
| service_level_objectives               | Sync Datadog SLOs.                                                   |
| slo_corrections                        | Sync Datadog SLO corrections.                                        |
| spans_metrics                          | Sync Datadog spans metrics.                                          |
| synthetics_global_variables            | Sync Datadog synthetic global variables.                             |
| synthetics_private_locations           | Sync Datadog synthetic private locations.                            |
| synthetics_tests                       | Sync Datadog synthetic tests.                                        |
| teams                                  | Sync Datadog teams (excluding users and permissions).                |
| users                                  | Sync Datadog users.                                                  |
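The two new `logs_archives` rows map to Datadog's v2 Logs Archives API. As a sketch only (the helper below is hypothetical and not part of the PR), this is the list-response shape that `LogsArchives.get_resources` consumes — the same shape the recorded cassette in this diff shows:

```python
def archive_ids(list_response: dict) -> list:
    """Return the archive ids from a v2 logs-archives list response.

    The v2 endpoint wraps archives in a top-level "data" array; each entry
    carries "type", "id", and an "attributes" object.
    """
    return [archive["id"] for archive in list_response.get("data", [])]


# Example payload, abbreviated from the cassette recorded in this PR.
example = {
    "data": [
        {
            "type": "archives",
            "id": "V49TnL93R0C3QADZQllO5Q",
            "attributes": {"name": "my first s3 archive", "state": "UNKNOWN"},
        },
    ]
}
```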

***Note:*** The `logs_custom_pipelines` resource has been deprecated in favor of the `logs_pipelines` resource, which supports both logs OOTB integration and custom pipelines. To migrate to the new resource, rename the existing state files from `logs_custom_pipelines.json` to `logs_pipelines.json` for both the source and destination files.
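The rename described in the note can be sketched as a small helper. This is illustrative only — `migrate_pipeline_state` is a hypothetical name, and the directory layout is an assumption; point it at wherever your state files actually live:

```python
from pathlib import Path


def migrate_pipeline_state(state_dir: str) -> list:
    """Rename deprecated logs_custom_pipelines.json state files to
    logs_pipelines.json under state_dir (e.g. both the source and
    destination state folders)."""
    renamed = []
    for old in Path(state_dir).glob("**/logs_custom_pipelines.json"):
        new = old.with_name("logs_pipelines.json")
        if not new.exists():  # do not clobber an already-migrated file
            old.rename(new)
            renamed.append(str(new))
    return renamed
```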

@@ -259,6 +261,8 @@ See [Supported resources](#supported-resources) section below for potential reso
| downtime_schedules | monitors |
| downtimes (**deprecated**) | monitors |
| host_tags | - |
| logs_archives | - (Requires manual setup of AWS, GCP, or Azure integration) |
| logs_archives_order | logs_archives |
| logs_custom_pipelines (**deprecated**) | - |
| logs_indexes | - |
| logs_indexes_order | logs_indexes |
67 changes: 67 additions & 0 deletions datadog_sync/model/logs_archives.py
@@ -0,0 +1,67 @@
# Unless explicitly stated otherwise all files in this repository are licensed
# under the 3-clause BSD style license (see LICENSE).
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2019 Datadog, Inc.

from __future__ import annotations
from typing import TYPE_CHECKING, Optional, List, Dict, Tuple, cast

from datadog_sync.utils.base_resource import BaseResource, ResourceConfig

if TYPE_CHECKING:
from datadog_sync.utils.custom_client import CustomClient


class LogsArchives(BaseResource):
    resource_type = "logs_archives"
    resource_config = ResourceConfig(
        base_path="/api/v2/logs/config/archives",
        excluded_attributes=["id", "attributes.state"],
    )
    # Additional LogsArchives specific attributes

    async def get_resources(self, client: CustomClient) -> List[Dict]:
        resp = await client.get(self.resource_config.base_path)

        return resp["data"]

    async def import_resource(self, _id: Optional[str] = None, resource: Optional[Dict] = None) -> Tuple[str, Dict]:
        if _id:
            source_client = self.config.source_client
            resource = (await source_client.get(self.resource_config.base_path + f"/{_id}"))["data"]
        resource = cast(dict, resource)

        return resource["id"], resource

    async def pre_resource_action_hook(self, _id, resource: Dict) -> None:
        pass

    async def pre_apply_hook(self) -> None:
        pass

    async def create_resource(self, _id: str, resource: Dict) -> Tuple[str, Dict]:
        destination_client = self.config.destination_client
        payload = {"data": resource}
        resp = await destination_client.post(self.resource_config.base_path, payload)

        return _id, resp["data"]

    async def update_resource(self, _id: str, resource: Dict) -> Tuple[str, Dict]:
        destination_client = self.config.destination_client
        payload = {"data": resource}
        resp = await destination_client.put(
            self.resource_config.base_path + f"/{self.config.state.destination[self.resource_type][_id]['id']}",
            payload,
        )

        self.config.state.destination[self.resource_type][_id] = resp["data"]
        return _id, resp["data"]

    async def delete_resource(self, _id: str) -> None:
        destination_client = self.config.destination_client
        await destination_client.delete(
            self.resource_config.base_path + f"/{self.config.state.destination[self.resource_type][_id]['id']}"
        )

    def connect_id(self, key: str, r_obj: Dict, resource_to_connect: str) -> Optional[List[str]]:
        pass
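As a standalone illustration of what `excluded_attributes=["id", "attributes.state"]` achieves for this resource: server-assigned fields are dropped before a resource is diffed or pushed to the destination org. The helper below is hypothetical — the real exclusion logic lives in the tool's `BaseResource`/utils, not in this file — but it shows the dotted-path semantics:

```python
from copy import deepcopy


def strip_excluded(resource: dict, excluded=("id", "attributes.state")) -> dict:
    """Return a copy of resource with the dotted-path attributes removed."""
    cleaned = deepcopy(resource)
    for dotted in excluded:
        *path, leaf = dotted.split(".")
        parent = cleaned
        for key in path:
            # Walk down nested dicts; missing intermediate keys are tolerated.
            parent = parent.get(key, {})
        parent.pop(leaf, None)  # tolerate already-missing leaves
    return cleaned
```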
115 changes: 115 additions & 0 deletions datadog_sync/model/logs_archives_order.py
@@ -0,0 +1,115 @@
# Unless explicitly stated otherwise all files in this repository are licensed
# under the 3-clause BSD style license (see LICENSE).
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2019 Datadog, Inc.

from __future__ import annotations
from typing import TYPE_CHECKING, Optional, List, Dict, Tuple
from copy import deepcopy

from deepdiff.operator import BaseOperator

from datadog_sync.utils.base_resource import BaseResource, ResourceConfig

if TYPE_CHECKING:
from datadog_sync.utils.custom_client import CustomClient


class LogsArchivesOrderIdsComparator(BaseOperator):
    def match(self, level):
        if "archive_ids" in level.t1 and "archive_ids" in level.t2:
            # make copy so we do not mutate the original
            level.t1 = deepcopy(level.t1)
            level.t2 = deepcopy(level.t2)

            # If we are at the top level, modify the list to exclude extra archives in destination.
            t1 = set(level.t1["archive_ids"])
            t2 = set(level.t2["archive_ids"])
            d_ignore = t1 - t2

            level.t1["archive_ids"] = [_id for _id in level.t1["archive_ids"] if _id not in d_ignore]
        return True

    def give_up_diffing(self, level, diff_instance) -> bool:
        return False


class LogsArchivesOrder(BaseResource):
    resource_type = "logs_archives_order"
    resource_config = ResourceConfig(
        concurrent=False,
        base_path="/api/v2/logs/config/archive-order",
        resource_connections={
            "logs_archives": ["data.attributes.archive_ids"],
        },
        deep_diff_config={
            "ignore_order": False,
            "custom_operators": [LogsArchivesOrderIdsComparator()],
        },
    )
    # Additional LogsArchivesOrder specific attributes
    destination_archives_order: Dict[str, Dict] = dict()
    default_id: str = "logs-archives-order"

    async def get_resources(self, client: CustomClient) -> List[Dict]:
        resp = await client.get(self.resource_config.base_path)

        return [resp]

    async def import_resource(self, _id: Optional[str] = None, resource: Optional[Dict] = None) -> Tuple[str, Dict]:
        if _id:
            source_client = self.config.source_client
            resource = await source_client.get(self.resource_config.base_path)

        return self.default_id, resource

    async def pre_resource_action_hook(self, _id, resource: Dict) -> None:
        self.destination_archives_order = await self.get_destination_archives_order()

    async def pre_apply_hook(self) -> None:
        pass

    async def create_resource(self, _id: str, resource: Dict) -> Tuple[str, Dict]:
        if not self.destination_archives_order:
            raise Exception("Failed to retrieve destination orgs logs archive order")

        self.config.state.destination[self.resource_type][_id] = self.destination_archives_order
        return await self.update_resource(_id, resource)

    async def update_resource(self, _id: str, resource: Dict) -> Tuple[str, Dict]:
        destination_resources = (
            self.destination_archives_order or self.config.state.destination[self.resource_type][_id]
        )
        ids_to_omit = set(resource["data"]["attributes"]["archive_ids"]) - set(
            destination_resources["data"]["attributes"]["archive_ids"]
        )

        extra_ids_to_include = [
            _id
            for _id in destination_resources["data"]["attributes"]["archive_ids"]
            if _id not in resource["data"]["attributes"]["archive_ids"]
        ]

        resource["data"]["attributes"]["archive_ids"] = [
            _id for _id in resource["data"]["attributes"]["archive_ids"] if _id not in ids_to_omit
        ]
        resource["data"]["attributes"]["archive_ids"] = (
            resource["data"]["attributes"]["archive_ids"] + extra_ids_to_include
        )

        destination_client = self.config.destination_client
        resp = await destination_client.put(self.resource_config.base_path, resource)

        return _id, resp

    async def delete_resource(self, _id: str) -> None:
        self.config.logger.warning("logs_archives_order cannot be deleted. Removing resource from config only.")

    def connect_id(self, key: str, r_obj: Dict, resource_to_connect: str) -> Optional[List[str]]:
        return super(LogsArchivesOrder, self).connect_id(key, r_obj, resource_to_connect)

    async def get_destination_archives_order(self):
        destination_client = self.config.destination_client
        resp = await self.get_resources(destination_client)

        return resp[0]
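The reconciliation performed in `update_resource` above can be sketched as a pure function (hypothetical name, not part of the PR): source archive ids with no counterpart in the destination are dropped, the source ordering is kept for shared ids, and destination-only ids are appended at the end so the PUT payload always covers exactly the destination's archives:

```python
def merge_archive_order(source_ids: list, destination_ids: list) -> list:
    """Reconcile a source org's archive ordering against a destination org."""
    # Source ids that do not exist in the destination cannot be ordered there.
    ids_to_omit = set(source_ids) - set(destination_ids)
    # Destination-only archives keep their place at the end of the order.
    extras = [i for i in destination_ids if i not in source_ids]
    return [i for i in source_ids if i not in ids_to_omit] + extras
```

This mirrors why `LogsArchivesOrderIdsComparator` also drops destination-missing ids before diffing: otherwise every sync would report a spurious ordering change.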
2 changes: 2 additions & 0 deletions datadog_sync/models/__init__.py
@@ -10,6 +10,8 @@
from datadog_sync.model.downtime_schedules import DowntimeSchedules
from datadog_sync.model.downtimes import Downtimes
from datadog_sync.model.host_tags import HostTags
from datadog_sync.model.logs_archives_order import LogsArchivesOrder
from datadog_sync.model.logs_archives import LogsArchives
from datadog_sync.model.logs_custom_pipelines import LogsCustomPipelines
from datadog_sync.model.logs_indexes import LogsIndexes
from datadog_sync.model.logs_indexes_order import LogsIndexesOrder
@@ -0,0 +1 @@
2024-09-04T13:11:22.480442-04:00
@@ -0,0 +1 @@
2024-09-04T13:11:22.498371-04:00
@@ -0,0 +1,30 @@
interactions:
- request:
    body: null
    headers:
      Content-Type:
      - application/json
    method: DELETE
    uri: https://api.datadoghq.eu/api/v2/logs/config/archives/Zulo48JVQXSyM97uKjGHpA
  response:
    body:
      string: ''
    headers: {}
    status:
      code: 204
      message: No Content
- request:
    body: null
    headers:
      Content-Type:
      - application/json
    method: DELETE
    uri: https://api.datadoghq.eu/api/v2/logs/config/archives/3kkgEN2qQ7i_qfP0yFZx5g
  response:
    body:
      string: ''
    headers: {}
    status:
      code: 204
      message: No Content
version: 1
@@ -0,0 +1 @@
2024-09-04T13:11:21.025343-04:00
@@ -0,0 +1,29 @@
interactions:
- request:
    body: null
    headers:
      Content-Type:
      - application/json
    method: GET
    uri: https://api.datadoghq.com/api/v2/logs/config/archives
  response:
    body:
      string: '{"data": [{"type": "archives", "id": "V49TnL93R0C3QADZQllO5Q", "attributes":
        {"name": "my first s3 archive", "query": "service:tutu", "state": "UNKNOWN",
        "destination": {"bucket": "my-bucket", "path": "/path/foo", "type": "s3",
        "integration": {"role_name": "testacc-datadog-integration-role", "account_id":
        "123456789112"}}, "rehydration_tags": ["team:intake", "team:app"], "include_tags":
        true, "rehydration_max_scan_size_in_gb": 123}}, {"type": "archives", "id":
        "RK1PeXaNRButwKNMn_dRJQ", "attributes": {"name": "my first azure archive",
        "query": "service:toto", "state": "UNKNOWN", "destination": {"container":
        "my-container", "storage_account": "storageaccount", "path": "/path/blou",
        "type": "azure", "integration": {"tenant_id": "92f7df25-f9d7-4e76-a3b6-4011e64958ea",
        "client_id": "a75fbdd2-ade6-43d0-a810-4d886c53871e"}}, "rehydration_tags":
        [], "include_tags": false, "rehydration_max_scan_size_in_gb": null}}]}'
    headers:
      Content-Type:
      - application/json
    status:
      code: 200
      message: OK
version: 1
@@ -0,0 +1 @@
2024-09-04T13:11:21.284161-04:00