
Rename s3 input to aws-s3 input (#23469)
kaiyan-sheng authored Jan 19, 2021
1 parent 2f50f9e commit 616266f
Showing 36 changed files with 64 additions and 63 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.next.asciidoc
@@ -100,6 +100,7 @@ https://github.com/elastic/beats/compare/v7.0.0-alpha2...master[Check the HEAD d
- Rename bad ECS field name tracing.trace.id to trace.id in aws elb fileset. {pull}22571[22571]
- Fix parsing issues with nested JSON payloads in Elasticsearch audit log fileset. {pull}22975[22975]
- Rename `network.direction` values in crowdstrike/falcon to `ingress`/`egress`. {pull}23041[23041]
- Rename `s3` input to `aws-s3` input. {pull}23469[23469]
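
For existing configurations, the rename only changes the input `type` value; a minimal before/after sketch (the queue URL below is a placeholder):

[source,yaml]
----
filebeat.inputs:
# Before this change:
#- type: s3
#  queue_url: https://sqs.us-east-1.amazonaws.com/123/my-queue
# After this change:
- type: aws-s3
  queue_url: https://sqs.us-east-1.amazonaws.com/123/my-queue
----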

*Heartbeat*
- Adds negative body match. {pull}20728[20728]
8 changes: 4 additions & 4 deletions filebeat/docs/aws-credentials-examples.asciidoc
@@ -3,7 +3,7 @@
[source,yaml]
----
filebeat.inputs:
- type: s3
- type: aws-s3
queue_url: https://sqs.us-east-1.amazonaws.com/123/test-queue
access_key_id: '<access_key_id>'
secret_access_key: '<secret_access_key>'
@@ -15,7 +15,7 @@ or
[source,yaml]
----
filebeat.inputs:
- type: s3
- type: aws-s3
queue_url: https://sqs.us-east-1.amazonaws.com/123/test-queue
access_key_id: '${AWS_ACCESS_KEY_ID:""}'
secret_access_key: '${AWS_SECRET_ACCESS_KEY:""}'
@@ -27,7 +27,7 @@ filebeat.inputs:
[source,yaml]
----
filebeat.inputs:
- type: s3
- type: aws-s3
queue_url: https://sqs.us-east-1.amazonaws.com/123/test-queue
role_arn: arn:aws:iam::123456789012:role/test-mb
----
@@ -37,7 +37,7 @@ filebeat.inputs:
[source,yaml]
----
filebeat.inputs:
- type: s3
- type: aws-s3
queue_url: https://sqs.us-east-1.amazonaws.com/123/test-queue
credential_profile_name: test-fb
----
2 changes: 1 addition & 1 deletion filebeat/docs/filebeat-options.asciidoc
@@ -63,6 +63,7 @@ subdirectories of a directory.
You can configure {beatname_uc} to use the following inputs:

* <<{beatname_lc}-input-aws-cloudwatch>>
* <<{beatname_lc}-input-aws-s3>>
* <<{beatname_lc}-input-azure-eventhub>>
* <<{beatname_lc}-input-cloudfoundry>>
* <<{beatname_lc}-input-container>>
@@ -76,7 +77,6 @@ You can configure {beatname_uc} to use the following inputs:
* <<{beatname_lc}-input-netflow>>
* <<{beatname_lc}-input-o365audit>>
* <<{beatname_lc}-input-redis>>
* <<{beatname_lc}-input-s3>>
* <<{beatname_lc}-input-stdin>>
* <<{beatname_lc}-input-syslog>>
* <<{beatname_lc}-input-tcp>>
4 changes: 2 additions & 2 deletions filebeat/docs/modules/cisco.asciidoc
@@ -388,7 +388,7 @@ will be found under `rsa.raw`. The default is false.

The Cisco Umbrella fileset primarily focuses on reading CSV files from an S3 bucket using the filebeat S3 input.

To configure Cisco Umbrella to log to a self-managed S3 bucket, please follow the https://docs.umbrella.com/deployment-umbrella/docs/log-management[Cisco Umbrella User Guide] and the link:filebeat-input-s3.html[S3 input documentation] to set up the necessary Amazon SQS queue. Retrieving logs from a Cisco-managed S3 bucket is not currently supported.
To configure Cisco Umbrella to log to a self-managed S3 bucket, please follow the https://docs.umbrella.com/deployment-umbrella/docs/log-management[Cisco Umbrella User Guide] and the link:filebeat-input-aws-s3.html[AWS S3 input documentation] to set up the necessary Amazon SQS queue. Retrieving logs from a Cisco-managed S3 bucket is not currently supported.

This fileset supports all 4 log types:
- Proxy
@@ -409,7 +409,7 @@ Example config:
- module: cisco
umbrella:
enabled: true
var.input: s3
var.input: aws-s3
var.queue_url: https://sqs.us-east-1.amazonaws.com/ID/CiscoQueue
var.access_key_id: 123456
var.secret_access_key: PASSWORD
2 changes: 1 addition & 1 deletion x-pack/elastic-agent/pkg/agent/program/supported.go


4 changes: 2 additions & 2 deletions x-pack/elastic-agent/spec/filebeat.yml
@@ -59,7 +59,8 @@ rules:
selector: inputs
key: type
values:
- awscloudwatch
- aws-cloudwatch
- aws-s3
- azure-eventhub
- cloudfoundry
- container
@@ -73,7 +74,6 @@ rules:
- netflow
- o365audit
- redis
- s3
- stdin
- syslog
- tcp
@@ -54,21 +54,21 @@
# Path to a JSON file containing the credentials and key used to subscribe.
credentials_file: ${path.config}/my-pubsub-subscriber-credentials.json

#------------------------------ S3 input --------------------------------
#------------------------------ AWS S3 input --------------------------------
# Beta: Config options for AWS S3 input
#- type: s3
#- type: aws-s3
#enabled: false

# AWS Credentials
# If access_key_id and secret_access_key are configured, then use them to make api calls.
# If not, s3 input will load default AWS config or load with given profile name.
# If not, aws-s3 input will load default AWS config or load with given profile name.
#access_key_id: '${AWS_ACCESS_KEY_ID:""}'
#secret_access_key: '${AWS_SECRET_ACCESS_KEY:""}'
#session_token: '${AWS_SESSION_TOKEN:""}'
#credential_profile_name: test-s3-input
#credential_profile_name: test-aws-s3-input

# Queue url (required) to receive queue messages from
#queue_url: "https://sqs.us-east-1.amazonaws.com/1234/test-s3-logs-queue"
#queue_url: "https://sqs.us-east-1.amazonaws.com/1234/test-aws-s3-logs-queue"

# The duration (in seconds) that the received messages are hidden from subsequent
# retrieve requests after being retrieved by a ReceiveMessage request.
18 changes: 9 additions & 9 deletions x-pack/filebeat/docs/inputs/input-aws-s3.asciidoc
@@ -2,18 +2,18 @@

:libbeat-xpack-dir: ../../../../x-pack/libbeat

:type: s3
:type: aws-s3

[id="{beatname_lc}-input-{type}"]
=== S3 input
=== AWS S3 input

++++
<titleabbrev>S3</titleabbrev>
<titleabbrev>AWS S3</titleabbrev>
++++

beta[]

Use the `s3` input to retrieve logs from S3 objects that are pointed by messages
Use the `aws-s3` input to retrieve logs from S3 objects that are pointed by messages
from specific SQS queues. This input can, for example, be used to receive S3
server access logs to monitor detailed records for the requests that are made to
a bucket.
@@ -28,13 +28,13 @@ stopped and the sqs message will be returned back to the queue.
["source","yaml",subs="attributes"]
----
{beatname_lc}.inputs:
- type: s3
- type: aws-s3
queue_url: https://sqs.ap-southeast-1.amazonaws.com/1234/test-s3-queue
credential_profile_name: elastic-beats
expand_event_list_from_field: Records
----

The `s3` input supports the following configuration options plus the
The `aws-s3` input supports the following configuration options plus the
<<{beatname_lc}-input-{type}-common-options>> described later.

[float]
@@ -74,7 +74,7 @@ can be assigned the name of the field. This setting will be able to split the
messages under the group value into separate events. For example, CloudTrail logs
are in JSON format and events are found under the JSON object "Records".
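
A minimal sketch of that CloudTrail-style case (the object layout and queue URL are assumptions used for illustration):

[source,yaml]
----
filebeat.inputs:
# Assumed S3 object body:
#   {"Records": [{"eventName": "PutObject"}, {"eventName": "GetObject"}]}
# With the setting below, the input emits one event per array element:
#   {"eventName": "PutObject"}
#   {"eventName": "GetObject"}
- type: aws-s3
  queue_url: https://sqs.us-east-1.amazonaws.com/123/cloudtrail-queue
  expand_event_list_from_field: Records
----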

Note: When `expand_event_list_from_field` parameter is given in the config, s3
Note: When `expand_event_list_from_field` parameter is given in the config, aws-s3
input will assume the logs are in JSON format and decode them as JSON. Content
type will not be checked.
If a file has "application/json" content-type, `expand_event_list_from_field`
@@ -132,7 +132,7 @@ is 0 seconds. The maximum is 12 hours.
[float]
==== `aws credentials`

In order to make AWS API calls, `s3` input requires AWS credentials. Please see
In order to make AWS API calls, `aws-s3` input requires AWS credentials. Please see
<<aws-credentials-config,AWS credentials options>> for more details.

[float]
@@ -170,7 +170,7 @@ During this time, Filebeat processes and deletes the message. However, if
Filebeat fails before deleting the message and your system doesn't call the
DeleteMessage action for that message before the visibility timeout expires, the
message becomes visible to other {beatname_uc} instances, and the message is
received again. By default, the visibility timeout is set to 5 minutes for s3
received again. By default, the visibility timeout is set to 5 minutes for aws-s3
input in {beatname_uc}. 5 minutes is sufficient time for {beatname_uc} to read
SQS messages and process related s3 log files.
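
If processing regularly takes longer than that, the timeout can be raised on the input; a minimal sketch (the queue URL is a placeholder, and the `visibility_timeout` option name and its duration format are assumed from the reference configuration shown earlier):

[source,yaml]
----
filebeat.inputs:
- type: aws-s3
  queue_url: https://sqs.us-east-1.amazonaws.com/123/large-logs-queue
  # Keep SQS messages hidden from other consumers for 10 minutes while
  # Filebeat downloads and processes the referenced S3 objects.
  visibility_timeout: 600s
----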

12 changes: 6 additions & 6 deletions x-pack/filebeat/filebeat.reference.yml
@@ -604,7 +604,7 @@ filebeat.modules:
umbrella:
enabled: true

#var.input: s3
#var.input: aws-s3
# AWS SQS queue url
#var.queue_url: https://sqs.us-east-1.amazonaws.com/ID/CiscoQueue
# Access ID to authenticate with the S3 input
@@ -2711,21 +2711,21 @@ filebeat.inputs:
# Path to a JSON file containing the credentials and key used to subscribe.
credentials_file: ${path.config}/my-pubsub-subscriber-credentials.json

#------------------------------ S3 input --------------------------------
#------------------------------ AWS S3 input --------------------------------
# Beta: Config options for AWS S3 input
#- type: s3
#- type: aws-s3
#enabled: false

# AWS Credentials
# If access_key_id and secret_access_key are configured, then use them to make api calls.
# If not, s3 input will load default AWS config or load with given profile name.
# If not, aws-s3 input will load default AWS config or load with given profile name.
#access_key_id: '${AWS_ACCESS_KEY_ID:""}'
#secret_access_key: '${AWS_SECRET_ACCESS_KEY:""}'
#session_token: '${AWS_SESSION_TOKEN:""}'
#credential_profile_name: test-s3-input
#credential_profile_name: test-aws-s3-input

# Queue url (required) to receive queue messages from
#queue_url: "https://sqs.us-east-1.amazonaws.com/1234/test-s3-logs-queue"
#queue_url: "https://sqs.us-east-1.amazonaws.com/1234/test-aws-s3-logs-queue"

# The duration (in seconds) that the received messages are hidden from subsequent
# retrieve requests after being retrieved by a ReceiveMessage request.
2 changes: 1 addition & 1 deletion x-pack/filebeat/include/list.go


File renamed without changes.
@@ -2,7 +2,7 @@
// or more contributor license agreements. Licensed under the Elastic License;
// you may not use this file except in compliance with the Elastic License.

package s3
package awss3

import (
"bufio"
@@ -2,7 +2,7 @@
// or more contributor license agreements. Licensed under the Elastic License;
// you may not use this file except in compliance with the Elastic License.

package s3
package awss3

import (
"bufio"
@@ -2,7 +2,7 @@
// or more contributor license agreements. Licensed under the Elastic License;
// you may not use this file except in compliance with the Elastic License.

package s3
package awss3

import (
"fmt"


File renamed without changes.
@@ -2,7 +2,7 @@
// or more contributor license agreements. Licensed under the Elastic License;
// you may not use this file except in compliance with the Elastic License.

package s3
package awss3

import (
"context"
@@ -20,7 +20,7 @@ import (
"github.com/elastic/go-concert/ctxtool"
)

const inputName = "s3"
const inputName = "aws-s3"

func Plugin() v2.Plugin {
return v2.Plugin{
@@ -5,7 +5,7 @@
// +build integration
// +build aws

package s3
package awss3

import (
"context"