feat(splunk): support splunk hec logging plugin #5819

Merged · 15 commits · Dec 20, 2021
2 changes: 1 addition & 1 deletion README.md
@@ -134,7 +134,7 @@ A/B testing, canary release, blue-green deployment, limit rate, defense against
- High performance: The single-core QPS reaches 18k with an average delay of fewer than 0.2 milliseconds.
- [Fault Injection](docs/en/latest/plugins/fault-injection.md)
- [REST Admin API](docs/en/latest/admin-api.md): Using the REST Admin API to control Apache APISIX, which only allows 127.0.0.1 access by default, you can modify the `allow_admin` field in `conf/config.yaml` to specify a list of IPs that are allowed to call the Admin API. Also, note that the Admin API uses key auth to verify the identity of the caller. **The `admin_key` field in `conf/config.yaml` needs to be modified before deployment to ensure security**.
- External Loggers: Export access logs to external log management tools. ([HTTP Logger](docs/en/latest/plugins/http-logger.md), [TCP Logger](docs/en/latest/plugins/tcp-logger.md), [Kafka Logger](docs/en/latest/plugins/kafka-logger.md), [UDP Logger](docs/en/latest/plugins/udp-logger.md), [RocketMQ Logger](docs/en/latest/plugins/rocketmq-logger.md), [SkyWalking Logger](docs/en/latest/plugins/skywalking-logger.md), [Alibaba Cloud Logging(SLS)](docs/en/latest/plugins/sls-logger.md), [Google Cloud Logging](docs/en/latest/plugins/google-cloud-logging.md))
- External Loggers: Export access logs to external log management tools. ([HTTP Logger](docs/en/latest/plugins/http-logger.md), [TCP Logger](docs/en/latest/plugins/tcp-logger.md), [Kafka Logger](docs/en/latest/plugins/kafka-logger.md), [UDP Logger](docs/en/latest/plugins/udp-logger.md), [RocketMQ Logger](docs/en/latest/plugins/rocketmq-logger.md), [SkyWalking Logger](docs/en/latest/plugins/skywalking-logger.md), [Alibaba Cloud Logging(SLS)](docs/en/latest/plugins/sls-logger.md), [Google Cloud Logging](docs/en/latest/plugins/google-cloud-logging.md), [Splunk HEC Logging](docs/en/latest/plugins/splunk-hec-logging.md))
- [Datadog](docs/en/latest/plugins/datadog.md): push custom metrics to the DogStatsD server, comes bundled with [Datadog agent](https://docs.datadoghq.com/agent/), over the UDP protocol. DogStatsD basically is an implementation of StatsD protocol which collects the custom metrics for Apache APISIX agent, aggregates it into a single data point and sends it to the configured Datadog server.
- [Helm charts](https://github.com/apache/apisix-helm-chart)

150 changes: 150 additions & 0 deletions apisix/plugins/splunk-hec-logging.lua
@@ -0,0 +1,150 @@
--
-- Licensed to the Apache Software Foundation (ASF) under one or more
-- contributor license agreements. See the NOTICE file distributed with
-- this work for additional information regarding copyright ownership.
-- The ASF licenses this file to You under the Apache License, Version 2.0
-- (the "License"); you may not use this file except in compliance with
-- the License. You may obtain a copy of the License at
--
-- http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.
--

local core = require("apisix.core")
local ngx = ngx
local ngx_now = ngx.now
local http = require("resty.http")
local log_util = require("apisix.utils.log-util")
local bp_manager_mod = require("apisix.utils.batch-processor-manager")


local DEFAULT_SPLUNK_HEC_ENTRY_SOURCE = "apache-apisix-splunk-hec-logging"
local DEFAULT_SPLUNK_HEC_ENTRY_TYPE = "_json"


local plugin_name = "splunk-hec-logging"
local batch_processor_manager = bp_manager_mod.new(plugin_name)


local schema = {
    type = "object",
    properties = {
        endpoint = {
            type = "object",
            properties = {
                uri = core.schema.uri_def,
                token = {
                    type = "string",
                },
                channel = {
                    type = "string",
                },
                timeout = {
                    type = "integer",
                    minimum = 1,
                    default = 10
                }
            },
            required = { "uri", "token" }
        },
        ssl_verify = {
            type = "boolean",
            default = true
        },
    },
    required = { "endpoint" },
}


local _M = {
    version = 0.1,
    priority = 409,
    name = plugin_name,
    schema = batch_processor_manager:wrap_schema(schema),
}


function _M.check_schema(conf)
    return core.schema.check(schema, conf)
end


local function get_logger_entry(conf)
    local entry = log_util.get_full_log(ngx, conf)
    return {
        time = ngx_now(),
        host = entry.server.hostname,
        source = DEFAULT_SPLUNK_HEC_ENTRY_SOURCE,
        sourcetype = DEFAULT_SPLUNK_HEC_ENTRY_TYPE,
        event = {
            request_url = entry.request.url,
            request_method = entry.request.method,
            request_headers = entry.request.headers,
            request_query = entry.request.querystring,
            request_size = entry.request.size,
            response_headers = entry.response.headers,
            response_status = entry.response.status,
            response_size = entry.response.size,
            latency = entry.latency,
            upstream = entry.upstream,
        }
    }
end


local function send_to_splunk(conf, entries)
    local request_headers = {}
    request_headers["Content-Type"] = "application/json"
    request_headers["Authorization"] = "Splunk " .. conf.endpoint.token
    if conf.endpoint.channel then
        request_headers["X-Splunk-Request-Channel"] = conf.endpoint.channel
    end

    local http_new = http.new()
    http_new:set_timeout(conf.endpoint.timeout * 1000)
    local res, err = http_new:request_uri(conf.endpoint.uri, {
        ssl_verify = conf.ssl_verify,
        method = "POST",
        body = core.json.encode(entries),
        headers = request_headers,
    })

    if err then
        return false, "failed to write log to splunk, " .. err
    end

    if res.status ~= 200 then
        local body
        body, err = core.json.decode(res.body)
        if err then
            return false, "failed to send log to splunk, http status code: " .. res.status
        else
            return false, "failed to send log to splunk, " .. body.text
        end
    end

    return true
end


function _M.log(conf, ctx)
    local entry = get_logger_entry(conf)

    if batch_processor_manager:add_entry(conf, entry) then
        return
    end

    local process = function(entries)
        return send_to_splunk(conf, entries)
    end

    batch_processor_manager:add_entry_to_new_processor(conf, entry, ctx, process)
end


return _M
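
A note on the schema above: `endpoint.uri` and `endpoint.token` are required, so the Admin API should reject a plugin configuration that omits the token. A minimal sketch, assuming a default local APISIX instance and the stock admin key:

```shell
# Expected to fail schema validation: endpoint.token is required by the plugin schema.
# Assumes a default local APISIX instance and the stock admin key.
curl http://127.0.0.1:9080/apisix/admin/routes/1 \
    -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "uri": "/splunk.do",
    "plugins": {
        "splunk-hec-logging": {
            "endpoint": { "uri": "http://127.0.0.1:8088/services/collector" }
        }
    },
    "upstream": { "type": "roundrobin", "nodes": { "127.0.0.1:1980": 1 } }
}'
```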
13 changes: 13 additions & 0 deletions ci/pod/docker-compose.yml
@@ -413,6 +413,19 @@ services:
    networks:
      opa_net:

  # Splunk HEC Logging Service
  splunk:
    image: splunk/splunk:8.2.3
    restart: unless-stopped
    ports:
      - "18088:8088"
    environment:
      SPLUNK_PASSWORD: "ApacheAPISIX@666"
      SPLUNK_START_ARGS: "--accept-license"
      SPLUNK_HEC_TOKEN: "BD274822-96AA-4DA6-90EC-18940FB2414C"
      SPLUNK_HEC_SSL: "False"


networks:
  apisix_net:
  consul_net:
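A hedged convenience for local runs or CI, assuming the port mapping above: wait until the HEC endpoint reports healthy before exercising the plugin against it.

```shell
# Poll the dockerized Splunk HEC health endpoint (mapped to 18088 above) until it responds.
# /services/collector/health is Splunk's HEC health check; adjust host/port if your setup differs.
until curl -fsS http://127.0.0.1:18088/services/collector/health >/dev/null; do
    echo "waiting for splunk hec..."
    sleep 5
done
```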
1 change: 1 addition & 0 deletions conf/config-default.yaml
@@ -366,6 +366,7 @@ plugins: # plugin list (sorted by priority)
  - datadog # priority: 495
  - echo # priority: 412
  - http-logger # priority: 410
  - splunk-hec-logging # priority: 409
  - skywalking-logger # priority: 408
  - google-cloud-logging # priority: 407
  - sls-logger # priority: 406
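With `splunk-hec-logging` added to the plugin list above, one way to confirm that a running APISIX instance has picked it up is to query the Admin API plugin list. The sketch below assumes the default admin key and listen port:

```shell
# List the plugins loaded by APISIX and check that splunk-hec-logging is among them.
# Assumes the default admin key and listen port from config-default.yaml.
curl -s http://127.0.0.1:9080/apisix/admin/plugins/list \
    -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' | grep -o 'splunk-hec-logging'
```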
3 changes: 2 additions & 1 deletion docs/en/latest/config.json
@@ -122,7 +122,8 @@
"plugins/log-rotate",
"plugins/error-log-logger",
"plugins/sls-logger",
"plugins/google-cloud-logging"
"plugins/google-cloud-logging",
"plugins/splunk-hec-logging"
]
},
{
143 changes: 143 additions & 0 deletions docs/en/latest/plugins/splunk-hec-logging.md
@@ -0,0 +1,143 @@
---
title: splunk-hec-logging
---

<!--
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
-->

## Summary

- [**Name**](#name)
- [**Attributes**](#attributes)
- [**How To Enable**](#how-to-enable)
- [**Test Plugin**](#test-plugin)
- [**Disable Plugin**](#disable-plugin)

## Name

The `splunk-hec-logging` plugin is used to forward the request logs of `Apache APISIX` to `Splunk HTTP Event Collector (HEC)` for analysis and storage. When the plugin is enabled, `Apache APISIX` collects the request context information in the `Log` phase, serializes it into the [Splunk Event Data format](https://docs.splunk.com/Documentation/Splunk/latest/Data/FormateventsforHTTPEventCollector#Event_metadata), and submits it to the batch queue. The queued data is sent to `Splunk HEC` when the batch reaches its maximum size or the buffer flush timeout is reached.

For more information on the batch processor in Apache APISIX, please refer to [Batch-Processor](../batch-processor.md).
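
To make the event format concrete, the following is a rough sketch of posting a single event to HEC by hand, shaped like the payload this plugin produces. The token, channel, and field values are illustrative placeholders:

```shell
# Illustrative only: sends one event with the same structure the plugin generates.
# Token, channel, and field values below are placeholders.
curl -X POST "http://127.0.0.1:8088/services/collector" \
    -H "Content-Type: application/json" \
    -H "Authorization: Splunk BD274822-96AA-4DA6-90EC-18940FB2414C" \
    -H "X-Splunk-Request-Channel: FE0ECFAD-13D5-401B-847D-77833BD77131" \
    -d '{
        "time": 1639958400.536,
        "host": "localhost",
        "source": "apache-apisix-splunk-hec-logging",
        "sourcetype": "_json",
        "event": {
            "request_url": "http://127.0.0.1:9080/splunk.do?q=hello",
            "request_method": "GET",
            "response_status": 200,
            "latency": 0.81
        }
    }'
```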

## Attributes

| Name             | Requirement | Default | Description |
| ---------------- | ----------- | ------- | ----------- |
| endpoint         | required    |         | Splunk HEC endpoint configuration info |
| endpoint.uri     | required    |         | Splunk HEC event collector API URI |
| endpoint.token   | required    |         | Splunk HEC authentication token |
| endpoint.channel | optional    |         | Splunk HEC data channel identifier. Refer to [About HTTP Event Collector Indexer Acknowledgment](https://docs.splunk.com/Documentation/Splunk/8.2.3/Data/AboutHECIDXAck). |
| endpoint.timeout | optional    | 10      | Splunk HEC data send timeout, in seconds |
| ssl_verify       | optional    | true    | Whether to enable SSL verification. See the [OpenResty docs](https://github.com/openresty/lua-nginx-module#tcpsocksslhandshake). |
| max_retry_count  | optional    | 0       | Maximum number of retries before the batch is removed from the processing pipeline |
| retry_delay      | optional    | 1       | Number of seconds to delay processing before retrying after a failure |
| buffer_duration  | optional    | 60      | Maximum age in seconds of the oldest entry in a batch before the batch must be processed |
| inactive_timeout | optional    | 5       | Maximum idle time in seconds before the buffer is flushed |
| batch_max_size   | optional    | 1000    | Maximum number of entries in each batch |

## How To Enable

The following examples show how to enable the `splunk-hec-logging` plugin for a specific route.

### Full configuration

```shell
curl http://127.0.0.1:9080/apisix/admin/routes/1 -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "plugins":{
        "splunk-hec-logging":{
            "endpoint":{
                "uri":"http://127.0.0.1:8088/services/collector",
                "token":"BD274822-96AA-4DA6-90EC-18940FB2414C",
                "channel":"FE0ECFAD-13D5-401B-847D-77833BD77131",
                "timeout":60
            },
            "buffer_duration":60,
            "max_retry_count":0,
            "retry_delay":1,
            "inactive_timeout":2,
            "batch_max_size":10
        }
    },
    "upstream":{
        "type":"roundrobin",
        "nodes":{
            "127.0.0.1:1980":1
        }
    },
    "uri":"/splunk.do"
}'
```

### Minimal configuration

```shell
curl http://127.0.0.1:9080/apisix/admin/routes/1 -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "plugins":{
        "splunk-hec-logging":{
            "endpoint":{
                "uri":"http://127.0.0.1:8088/services/collector",
                "token":"BD274822-96AA-4DA6-90EC-18940FB2414C"
            }
        }
    },
    "upstream":{
        "type":"roundrobin",
        "nodes":{
            "127.0.0.1:1980":1
        }
    },
    "uri":"/splunk.do"
}'
```

## Test Plugin

* Send a request to the route configured with the `splunk-hec-logging` plugin:

```shell
$ curl -i http://127.0.0.1:9080/splunk.do?q=hello
HTTP/1.1 200 OK
...
hello, world
```

* Log in to the Splunk dashboard to search for and view the logs:

![splunk hec search view](../../../assets/images/plugin/splunk-hec-admin-en.png)
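
If you prefer the command line over the dashboard, the events can also be retrieved through Splunk's search REST API. This is only a sketch: it assumes the Splunk management port (8089) is reachable and that the admin credentials match your deployment; the `source` value is the plugin's default.

```shell
# Hedged sketch: search for events by the default source name set by the plugin.
# Assumes the Splunk management port (8089) is exposed and the admin password matches your deployment.
curl -k -u admin:ApacheAPISIX@666 \
    https://127.0.0.1:8089/services/search/jobs/export \
    --data-urlencode 'search=search source="apache-apisix-splunk-hec-logging"' \
    -d output_mode=json
```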

## Disable Plugin

To disable the `splunk-hec-logging` plugin, remove the corresponding JSON configuration from the route's `plugins` configuration:

```shell
$ curl http://127.0.0.1:9080/apisix/admin/routes/1 -H 'X-API-KEY: edd1c9f034335f136f87ad84b625c8f1' -X PUT -d '
{
    "uri": "/hello",
    "plugins": {},
    "upstream": {
        "type": "roundrobin",
        "nodes": {
            "127.0.0.1:1980": 1
        }
    }
}'
```
2 changes: 1 addition & 1 deletion docs/zh/latest/README.md
@@ -135,7 +135,7 @@ A/B testing, canary release (gray release), blue-green deployment, rate and speed limiting, defense…
- High performance: single-core QPS reaches 18k with a latency of only 0.2 milliseconds.
- [Fault Injection](plugins/fault-injection.md)
- [REST Admin API](admin-api.md): Use the REST Admin API to control Apache APISIX. It only allows 127.0.0.1 access by default; you can modify the `allow_admin` field in `conf/config.yaml` to specify a list of IPs that are allowed to call the Admin API. Also note that the Admin API uses key auth to verify the caller's identity. **The `admin_key` field in `conf/config.yaml` needs to be modified before deployment to ensure security.**
- External Loggers: Export access logs to external log management tools. ([HTTP Logger](plugins/http-logger.md), [TCP Logger](plugins/tcp-logger.md), [Kafka Logger](plugins/kafka-logger.md), [UDP Logger](plugins/udp-logger.md), [RocketMQ Logger](plugins/rocketmq-logger.md), [SkyWalking Logger](plugins/skywalking-logger.md), [Alibaba Cloud Logging(SLS)](plugins/sls-logger.md), [Google Cloud Logging](plugins/google-cloud-logging.md))
- External Loggers: Export access logs to external log management tools. ([HTTP Logger](plugins/http-logger.md), [TCP Logger](plugins/tcp-logger.md), [Kafka Logger](plugins/kafka-logger.md), [UDP Logger](plugins/udp-logger.md), [RocketMQ Logger](plugins/rocketmq-logger.md), [SkyWalking Logger](plugins/skywalking-logger.md), [Alibaba Cloud Logging(SLS)](plugins/sls-logger.md), [Google Cloud Logging](plugins/google-cloud-logging.md), [Splunk HEC Logging](plugins/splunk-hec-logging.md))
- [Helm charts](https://github.com/apache/apisix-helm-chart)

- **Highly Scalable**
3 changes: 2 additions & 1 deletion docs/zh/latest/config.json
@@ -120,7 +120,8 @@
"plugins/log-rotate",
"plugins/error-log-logger",
"plugins/sls-logger",
"plugins/google-cloud-logging"
"plugins/google-cloud-logging",
"plugins/splunk-hec-logging"
]
},
{