
SSH Analysis Example #110

Merged — 16 commits, Jan 9, 2017
28 changes: 28 additions & 0 deletions Security_Analytics/ssh_analysis/.kitchen/logs/kitchen.log
@@ -0,0 +1,28 @@
E, [2017-01-05T10:40:01.429642 #45503] ERROR -- Kitchen: ------Exception-------
E, [2017-01-05T10:40:01.429751 #45503] ERROR -- Kitchen: Class: Kitchen::UserError
E, [2017-01-05T10:40:01.429789 #45503] ERROR -- Kitchen: Message: Kitchen YAML file /opt/demos/examples/Security_Analytics/ssh_analysis/.kitchen.yml does not exist.
E, [2017-01-05T10:40:01.429824 #45503] ERROR -- Kitchen: ------Backtrace-------
E, [2017-01-05T10:40:01.429841 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/loader/yaml.rb:74:in `read'
E, [2017-01-05T10:40:01.429854 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/config.rb:149:in `data'
E, [2017-01-05T10:40:01.429865 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/config.rb:128:in `suites'
E, [2017-01-05T10:40:01.429875 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/config.rb:179:in `filter_instances'
E, [2017-01-05T10:40:01.429886 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/config.rb:138:in `build_instances'
E, [2017-01-05T10:40:01.429896 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/config.rb:114:in `instances'
E, [2017-01-05T10:40:01.429907 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/command.rb:115:in `filtered_instances'
E, [2017-01-05T10:40:01.429956 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/command.rb:145:in `parse_subcommand'
E, [2017-01-05T10:40:01.429966 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/command/list.rb:32:in `call'
E, [2017-01-05T10:40:01.429976 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/cli.rb:56:in `perform'
E, [2017-01-05T10:40:01.429986 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/cli.rb:116:in `list'
E, [2017-01-05T10:40:01.429996 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/thor-0.19.1/lib/thor/command.rb:27:in `run'
E, [2017-01-05T10:40:01.430006 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/thor-0.19.1/lib/thor/invocation.rb:126:in `invoke_command'
E, [2017-01-05T10:40:01.430055 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/cli.rb:321:in `invoke_task'
E, [2017-01-05T10:40:01.430099 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/thor-0.19.1/lib/thor.rb:359:in `dispatch'
E, [2017-01-05T10:40:01.430120 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/thor-0.19.1/lib/thor/base.rb:440:in `start'
E, [2017-01-05T10:40:01.430130 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/bin/kitchen:13:in `block in <top (required)>'
E, [2017-01-05T10:40:01.430140 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/lib/kitchen/errors.rb:154:in `with_friendly_errors'
E, [2017-01-05T10:40:01.430150 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/gems/test-kitchen-1.8.0/bin/kitchen:13:in `<top (required)>'
E, [2017-01-05T10:40:01.430174 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/bin/kitchen:23:in `load'
E, [2017-01-05T10:40:01.430186 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/bin/kitchen:23:in `<main>'
E, [2017-01-05T10:40:01.430195 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/bin/ruby_executable_hooks:15:in `eval'
E, [2017-01-05T10:40:01.430205 #45503] ERROR -- Kitchen: /Users/dale/.rvm/gems/ruby-2.3.0/bin/ruby_executable_hooks:15:in `<main>'
E, [2017-01-05T10:40:01.430215 #45503] ERROR -- Kitchen: ----------------------
156 changes: 156 additions & 0 deletions Security_Analytics/ssh_analysis/README.md
@@ -0,0 +1,156 @@
# Alerting on SSH CEF Data

This **Getting Started with Elastic Stack** example provides sample files to ingest, analyze, and alert on **SSH logs in CEF format** using the Elastic Stack.

Included are example watches for proactively monitoring this data for possible security incidents. These examples support the Security Analytics blog post series, specifically [Integrating Elasticsearch with ArcSight SIEM - Part 2]().
The first watch detects successful logins from external IP addresses.
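CEF events are single lines made up of a pipe-delimited header followed by space-separated `key=value` extension pairs. As a rough illustration of the format (the line below is fabricated in the style of the dataset, not taken from `ssh.cef`), the fields can be pulled apart with standard shell tools:

```shell
# A fabricated CEF line in the style of the ssh.cef dataset (vendor, product,
# and values are illustrative only).
line='CEF:0|Unix|sshd|5.0|cowrie.login.success|SSH Login|3|src=203.0.113.5 duser=root categoryOutcome=success'

# Header: seven pipe-delimited fields (CEF version|vendor|product|version|signature|name|severity).
name=$(echo "$line" | cut -d'|' -f6)

# Extension: everything after the seventh pipe, as key=value pairs.
extension=$(echo "$line" | cut -d'|' -f8-)
src=$(echo "$extension" | tr ' ' '\n' | sed -n 's/^src=//p')

echo "$name / $src"   # → SSH Login / 203.0.113.5
```

Logstash's CEF codec does this parsing for you; the sketch above only shows what the structure looks like.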

This example includes:

- [`ssh.cef`](http://download.elasticsearch.org/demos/cef_ssh/ssh.cef) - Sample SSH logs in CEF format
- `ssh_analysis_logstash.conf` - An appropriate Logstash configuration for indexing the above CEF data
- `ssh_analysis_kibana.json` - Simple Kibana visualizations and dashboards for the associated blog posts
- `successful_login_external.json` - A watch that detects successful logins from external IP addresses. REFERENCE ONLY.
- `successful_login_external.inline.json` - The above watch in an inline execution format so it can be used with the `run_watch.sh` script and be executed over the full dataset.
- `run_watch.sh` - A convenience script to execute a watch

This example depends on:

- [cef_template.json](https://github.com/elastic/examples/blob/master/Security_Analytics/cef_demo/logstash/pipeline/cef_template.json)

which will be installed when Logstash is run with the above configuration.

### Versions

Example has been tested with the following versions:

- Elasticsearch 5.1
- Logstash 5.1
- Kibana 5.1
- X-Pack 5.1

### Installation & Setup

* Follow the [Installation & Setup Guide](https://github.com/elastic/examples/blob/master/Installation%20and%20Setup.md) to install and test the Elastic Stack (*you can skip this step if you have a working installation of the Elastic Stack.*)


* Install X-Pack into Elasticsearch and Kibana

```shell
<path_to_elasticsearch_root_dir>/bin/elasticsearch-plugin install x-pack
<path_to_kibana_root_dir>/bin/kibana-plugin install x-pack
```

* Run Elasticsearch & Kibana
```shell
<path_to_elasticsearch_root_dir>/bin/elasticsearch
<path_to_kibana_root_dir>/bin/kibana
```

* Check that Elasticsearch and Kibana are up and running.
- Open `localhost:9200` in a web browser -- should return a JSON message indicating Elasticsearch is running.
- Open `localhost:5601` in a web browser -- should display the Kibana UI.

**Note:** By default, Elasticsearch runs on port 9200 and Kibana runs on port 5601. If you changed the default ports, adjust the URLs above accordingly.
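The two checks above can also be scripted. The sketch below is a convenience, not part of the example files; it assumes the default `elastic`/`changeme` X-Pack credentials and guards against the case where nothing is listening yet:

```shell
es_url='http://localhost:9200'

# Probe Elasticsearch; -s keeps curl quiet, and the if-guard makes the
# check safe to run on a machine where Elasticsearch is not up yet.
if curl -s -u elastic:changeme "$es_url" >/dev/null 2>&1; then
  msg="Elasticsearch is up at $es_url"
else
  msg="Elasticsearch is not reachable at $es_url"
fi
echo "$msg"
```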

### Download Example Files

Download the following files in this repo to a local directory:

- [`ssh.cef`](http://download.elasticsearch.org/demos/cef_ssh/ssh.cef)
- `ssh_analysis_logstash.conf`
- `ssh_analysis_kibana.json`
- `successful_login_external.json`
- `successful_login_external.inline.json`

Additionally, download the following template dependency into the same local directory:

- `cef_template.json` from [here](https://github.com/elastic/examples/blob/master/Security_Analytics/cef_demo/logstash/pipeline/cef_template.json)

Unfortunately, GitHub does not provide a convenient one-click option to download the entire contents of a subfolder in a repo. Use the sample commands below to download the required files to a local directory:

```shell
mkdir ssh_analysis
cd ssh_analysis
wget https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/ssh_analysis/ssh_analysis_logstash.conf
wget https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/ssh_analysis/successful_login_external.json
wget https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/ssh_analysis/successful_login_external.inline.json
wget https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/ssh_analysis/ssh_analysis_kibana.json
wget http://download.elasticsearch.org/demos/cef_ssh/ssh.cef
wget https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/cef_demo/logstash/pipeline/cef_template.json
```

Don't have wget? Replace `wget` with `curl -O` in the commands above.
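The choice between the two downloaders can be wrapped in a tiny helper (a hypothetical convenience function, not part of the example files):

```shell
# Use wget when available, otherwise fall back to curl -O; both save the
# file under its remote name in the current directory.
fetch() {
  if command -v wget >/dev/null 2>&1; then
    wget "$1"
  else
    curl -O "$1"
  fi
}

# e.g. fetch https://raw.githubusercontent.com/elastic/examples/master/Security_Analytics/ssh_analysis/ssh_analysis_logstash.conf
```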

### Run Example

#### 1. Start Logstash with the appropriate configuration

```shell
<path_to_logstash_root_dir>/bin/logstash -f ssh_analysis_logstash.conf
```

Wait for Logstash to start, as indicated by the message "Successfully started Logstash API endpoint".


#### 2. Ingest data into Elasticsearch using Logstash

* In a separate terminal, execute the following command to load the sample logs into Elasticsearch. [Note: It takes a few minutes to ingest the entire file (114,147 documents) into Elasticsearch]

```shell
cat ssh.cef | nc localhost 5000
```

The command returns once ingestion is complete.

* Verify that data is successfully indexed into Elasticsearch

Querying `http://localhost:9200/cef-ssh-*/_count` should return a response containing `"count":114147`.
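To script the same check, the count can be extracted from the `_count` response. The JSON below is a canned response in the shape Elasticsearch returns (shard numbers are illustrative), so the sketch runs without a live cluster:

```shell
# Canned _count response; against a live cluster you would instead use:
#   response=$(curl -s -u elastic:changeme 'localhost:9200/cef-ssh-*/_count')
response='{"count":114147,"_shards":{"total":5,"successful":5,"failed":0}}'

# Pull the numeric count out of the JSON with sed.
count=$(echo "$response" | sed -n 's/.*"count":\([0-9]*\).*/\1/p')
echo "$count"   # → 114147
```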

**Note:** The included `ssh_analysis_logstash.conf` configuration file assumes that you are running Elasticsearch on the same host as Logstash with the default settings. Modify the `hosts` setting in the `output { elasticsearch { ... } }` section of `ssh_analysis_logstash.conf` if needed. It also assumes the default X-Pack security credentials of `elastic`/`changeme`; change these as required.

#### 3. Execute A Watch

**Because the data is historical, the watch must be executed over the full dataset rather than just the previous N minutes. The inline version of the watch removes the time restriction to allow this.**

To run a watch over the full dataset, either:

* Execute the following command from the `ssh_analysis` directory to execute a specific watch

```shell
./run_watch.sh <name of watch> <username> <password>
```

`username` and `password` are both optional and default to `elastic` and `changeme` respectively.

e.g.

```shell
./run_watch.sh successful_login_external.inline
```


OR MANUALLY

* Access Kibana by going to `http://localhost:5601` in a web browser
* Click the **Dev Tools** tab >> **Console** tab
* Use the [inline execution api](https://www.elastic.co/guide/en/x-pack/5.1/watcher-api-execute-watch.html#watcher-api-execute-inline-watch) to execute the watch, copying the contents for the watch key from the appropriate inline file e.g. `successful_login_external.inline.json`
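The request body for the inline execution API wraps the watch definition under a top-level `watch` key. As a sketch of the shape (the trigger, input, and condition below are placeholders, not the actual contents of `successful_login_external.inline.json`):

```shell
# Placeholder inline-execute payload; the real watch body lives in
# successful_login_external.inline.json.
cat > execute_payload.json <<'EOF'
{
  "watch": {
    "trigger": { "schedule": { "interval": "5m" } },
    "input": {
      "search": {
        "request": {
          "indices": ["cef-ssh-*"],
          "body": { "query": { "match_all": {} } }
        }
      }
    },
    "condition": { "compare": { "ctx.payload.hits.total": { "gt": 0 } } },
    "actions": {}
  }
}
EOF

# Sanity-check the JSON before sending it.
python3 -m json.tool execute_payload.json >/dev/null && echo "payload OK"

# Against a running cluster you would then POST it:
# curl -u elastic:changeme -XPOST 'localhost:9200/_xpack/watcher/watch/_execute' -d @execute_payload.json
```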

#### 4. Visualize the results in Kibana

* Access Kibana by going to `http://localhost:5601` in a web browser
* Connect Kibana to the `cef-ssh-*` and `cef-ssh-watch-results` indices in Elasticsearch (created automatically in the previous steps)
* Click the **Management** tab >> **Index Patterns** tab >> **Create New**. Specify `cef-ssh-*` as the index pattern name and click **Create** to define the index pattern. (Leave the **Use event times to create index names** box unchecked and use `@timestamp` as the Time Field)
* Click the **Management** tab >> **Index Patterns** tab >> **Create New**. Specify `cef-ssh-watch-results` as the index pattern name and click **Create** to define the index pattern. (Leave the **Use event times to create index names** box unchecked and use `@timestamp` as the Time Field)
* Load sample dashboard into Kibana
* Click the **Management** tab >> **Saved Objects** tab >> **Import**, and select `ssh_analysis_kibana.json`
* Open dashboard
* Click on **Dashboard** tab and open `CEF Login Dashboard` dashboard

Voila! You should see the following dashboard.

![Kibana Dashboard Screenshot](https://cloud.githubusercontent.com/assets/12695796/21648199/0078db7e-d295-11e6-8a3c-357074a4e12a.png)

### We would love your feedback!
If you found this example helpful and would like more such Getting Started examples for other standard formats, we would love to hear from you. If you would like to contribute Getting Started examples to this repo, we'd love that too!
26 changes: 26 additions & 0 deletions Security_Analytics/ssh_analysis/run_watch.sh
@@ -0,0 +1,26 @@
#!/bin/bash
if [ -z "$1" ] ; then
  echo "No watch name supplied e.g. ./run_watch.sh successful_login_external"
  exit 1
fi

username=elastic
if [ "$2" ] ; then
  username=$2
fi

password=changeme
if [ "$3" ] ; then
  password=$3
fi

echo "Loading $1 watch"

curl -s -o /dev/null -X DELETE "localhost:9200/_xpack/watcher/watch/$1" -u "$username:$password"
es_response=$(curl -w "%{http_code}" -s -o /dev/null -X POST "localhost:9200/_xpack/watcher/watch/_execute" -u "$username:$password" -d @"$1.json")
if [ 0 -eq $? ] && [ "$es_response" = "200" ]; then
  echo "Loading $1 watch...OK"
  exit 0
else
  echo "Loading $1 watch...FAILED"
  exit 1
fi
68 changes: 68 additions & 0 deletions Security_Analytics/ssh_analysis/ssh_analysis_kibana.json
@@ -0,0 +1,68 @@
[
{
"_id": "CEF-Login-Dashboard",
"_type": "dashboard",
"_source": {
"title": "CEF Login Dashboard",
"hits": 0,
"description": "",
"panelsJSON": "[{\"id\":\"Top-Attempted-Passwords\",\"type\":\"visualization\",\"panelIndex\":1,\"size_x\":6,\"size_y\":4,\"col\":1,\"row\":1},{\"id\":\"Logins\",\"type\":\"visualization\",\"panelIndex\":2,\"size_x\":12,\"size_y\":4,\"col\":1,\"row\":5},{\"id\":\"Failed-vs-Successful-Logins\",\"type\":\"visualization\",\"panelIndex\":3,\"size_x\":6,\"size_y\":4,\"col\":7,\"row\":1}]",
"optionsJSON": "{\"darkTheme\":false}",
"uiStateJSON": "{\"P-2\":{\"vis\":{\"params\":{\"sort\":{\"columnIndex\":null,\"direction\":null}}}}}",
"version": 1,
"timeRestore": true,
"timeTo": "2016-12-08T16:45:03.557Z",
"timeFrom": "2016-11-14T06:30:21.343Z",
"refreshInterval": {
"display": "Off",
"pause": false,
"value": 0
},
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\"filter\":[{\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true,\"default_field\":\"_all\"}}}]}"
}
}
},
{
"_id": "Failed-vs-Successful-Logins",
"_type": "visualization",
"_source": {
"title": "Failed vs Successful Logins",
"visState": "{\n \"type\": \"timelion\",\n \"title\": \"Failed vs Successful Logins\",\n \"params\": {\n \"expression\": \".es(q=categoryOutcome:success, timefield=@timestamp,index=cef-ssh-*).label(\\\"successful logins\\\").yaxis(2) .es(q=categoryOutcome:failure, timefield=@timestamp,index=cef-ssh-*).label(\\\"failed logins\\\")\",\n \"interval\": \"auto\"\n }\n}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{}"
}
}
},
{
"_id": "Logins",
"_type": "visualization",
"_source": {
"title": "Logins",
"visState": "{\n \"title\": \"Logins\",\n \"type\": \"table\",\n \"params\": {\n \"perPage\": 10,\n \"showPartialRows\": false,\n \"showMeticsAtAllLevels\": false,\n \"sort\": {\n \"columnIndex\": null,\n \"direction\": null\n },\n \"showTotal\": false,\n \"totalFunc\": \"sum\"\n },\n \"aggs\": [\n {\n \"id\": \"1\",\n \"enabled\": true,\n \"type\": \"max\",\n \"schema\": \"metric\",\n \"params\": {\n \"field\": \"@timestamp\",\n \"customLabel\": \"Latest Event Time\"\n }\n },\n {\n \"id\": \"2\",\n \"enabled\": true,\n \"type\": \"terms\",\n \"schema\": \"bucket\",\n \"params\": {\n \"field\": \"destinationHostName\",\n \"size\": 100,\n \"order\": \"desc\",\n \"orderBy\": \"1\",\n \"customLabel\": \"Target Server\"\n }\n },\n {\n \"id\": \"3\",\n \"enabled\": true,\n \"type\": \"terms\",\n \"schema\": \"bucket\",\n \"params\": {\n \"field\": \"sourceAddress\",\n \"size\": 100,\n \"order\": \"desc\",\n \"orderBy\": \"1\",\n \"customLabel\": \"Source IP Address\"\n }\n }\n ],\n \"listeners\": {}\n}",
"uiStateJSON": "{\n \"vis\": {\n \"params\": {\n \"sort\": {\n \"columnIndex\": null,\n \"direction\": null\n }\n }\n }\n}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"cef-ssh-watch-results\",\n \"query\": {\n \"query_string\": {\n \"query\": \"*\",\n \"analyze_wildcard\": true,\n \"default_field\": \"_all\"\n }\n },\n \"filter\": []\n}"
}
}
},
{
"_id": "Top-Attempted-Passwords",
"_type": "visualization",
"_source": {
"title": "Top Attempted Passwords",
"visState": "{\n \"title\": \"Top Attempted Passwords\",\n \"type\": \"histogram\",\n \"params\": {\n \"shareYAxis\": true,\n \"addTooltip\": true,\n \"addLegend\": true,\n \"legendPosition\": \"right\",\n \"scale\": \"linear\",\n \"mode\": \"stacked\",\n \"times\": [],\n \"addTimeMarker\": false,\n \"defaultYExtents\": false,\n \"setYExtents\": false,\n \"yAxis\": {}\n },\n \"aggs\": [\n {\n \"id\": \"1\",\n \"enabled\": true,\n \"type\": \"count\",\n \"schema\": \"metric\",\n \"params\": {}\n },\n {\n \"id\": \"2\",\n \"enabled\": true,\n \"type\": \"terms\",\n \"schema\": \"segment\",\n \"params\": {\n \"field\": \"deviceCustomString3\",\n \"exclude\": {\n \"pattern\": \"\"\n },\n \"size\": 10,\n \"order\": \"desc\",\n \"orderBy\": \"1\",\n \"customLabel\": \"Attempted Passwords\"\n }\n }\n ],\n \"listeners\": {}\n}",
"uiStateJSON": "{}",
"description": "",
"version": 1,
"kibanaSavedObjectMeta": {
"searchSourceJSON": "{\n \"index\": \"cef-ssh-*\",\n \"query\": {\n \"query_string\": {\n \"query\": \"*\",\n \"analyze_wildcard\": true,\n \"default_field\": \"_all\"\n }\n },\n \"filter\": []\n}"
}
}
}
]
@@ -10,41 +10,36 @@ input {

filter {
# To map the attacker Geo IP if possible

geoip {
source => "sourceAddress"
target => "source"
}

# To map the target Geo IP if possible

geoip {
source => "destinationAddress"
target => "destination"
}

# To map the log producing device Geo IP if possible

geoip {
source => "deviceAddress"
target => "device"
}

#Map startTime to @timestamp field
date {
match => ["startTime","MMM dd YYY HH:mm:ss"]
}
}

output {

elasticsearch {
template_name => "cef"
template => "./cef_template.json"
template_overwrite => true
index => "cef-ssh-%{+YYYY.MM.dd}"
password => "changeme"
user => "elastic"
}
}