Samples of Host Metrics Collection with Vector
This sample shows metrics collected from hosts via Vector. To get started with Vector metrics collection, follow these steps:
- Install Vector on your host via cURL (see the install command after this list)
- Create a Vector TOML config file named host_metrics.toml
- Execute the collection with
$ vector --config host_metrics.toml
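A typical cURL-based install uses Vector's documented installer script (verify the URL and flags against the current Vector install docs for your platform):
$ curl --proto '=https' --tlsv1.2 -sSfL https://sh.vector.dev | bash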
The current TOML consists of a Vector source and sink only. To expand transformation capabilities, see Vector Metrics; a minimal transform sketch is also shown after the config below.
file:
host_metrics.toml
# Source: collect host-level metrics every 15 seconds
[sources.my_source_id]
type = "host_metrics"
collectors = [ "cpu", "memory", "disk", "network", "filesystem" ]
scrape_interval_secs = 15

# Sink: send the collected metrics to Datadog
[sinks.my_sink_id]
type = "datadog_metrics"
inputs = [ "my_source_id" ]
api_key = "${DATADOG_API_KEY_ENV_VAR}"
default_namespace = "service"
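As a sketch of how a transform would slot in (the remap/VRL snippet and the tag_metrics name below are illustrative assumptions, not part of this sample), a transform sits between the source and the sink, and the sink's inputs then point at the transform instead of the source:
# Hypothetical transform: add a tag to every metric before it reaches the sink
[transforms.tag_metrics]
type = "remap"
inputs = [ "my_source_id" ]
source = '''
.tags.env = "dev"
'''
# The Datadog sink would then use:
# inputs = [ "tag_metrics" ]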
- Download the Vector - Host Metrics dashboard JSON file
- Import the JSON file into Datadog via the new dashboard process and accept the overwrite changes.
- Collect Syslog data with Vector (a syslog source sketch appears at the end of this section)
- Use scripts or commands to extract data from remote hosts
- Commands and their arguments must be supplied as an array, in the exact order they should be executed
file:
remote_poll.toml
# Source: runs the command on a schedule and collects its output
[sources.my_source_id]
type = "exec"
mode = "scheduled"
command = ["ssh","-i", ".vagrant/machines/default/virtualbox/private_key", "vagrant@192.168.86.239", "-o", "StrictHostKeyChecking=no", "-q", "iperf3", "-c", "hera", "-Z", "-J", "-T", "Standard"]
include_stderr = true
maximum_buffer_size_bytes = 1_000_000
scheduled.exec_interval_secs = 37
# Sink "out" prints the collected output to the console
[sinks.out]
inputs = ["my_source_id"]
type = "console"
encoding.codec = "text"
# Collect data and send it to Datadog as logs
[sinks.my_sink_id]
type = "datadog_logs"
inputs = [ "my_source_id" ]
default_api_key = "${DATADOG_API_KEY_ENV_VAR}"
region = "us"
compression = "gzip"
site = "datadoghq.com"
tags = ["source:vector","env:dev","collector:live process"]
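Before running, the config can optionally be checked with Vector's validate subcommand (assuming a current Vector release; it reports configuration errors without starting the pipeline):
$ vector validate remote_poll.toml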
Execute Vector to collect remote host data with
$ vector --config remote_poll.toml
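For the Syslog collection mentioned above, a minimal source sketch could be added to the config and wired into the same sinks via their inputs (the my_syslog_id name, TCP mode, and 0.0.0.0:514 address are assumptions; adjust for your environment):
# Hypothetical syslog source: listen for syslog messages over TCP
[sources.my_syslog_id]
type = "syslog"
mode = "tcp"
address = "0.0.0.0:514"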