Update documentation and fix bugs after final testing
chopicalqui committed Jan 4, 2021
1 parent a8617b1 commit 808b65c
Showing 9 changed files with 23 additions and 20 deletions.
11 changes: 6 additions & 5 deletions README.md
@@ -1,6 +1,6 @@
# Turbo Data Miner

-This extension adds a new tab `Turbo Miner` to Burp Suite's GUI as well as an new entry `Process in Turbo Data Miner
+This extension adds a new tab `Turbo Miner` to Burp Suite's GUI as well as a new entry `Process in Turbo Data Miner
(Proxy History Analyzer tab)` to Burp Suite's context menu. In the new tab, you are able to write new or select
existing Python scripts that are executed on each request/response item currently stored in the Proxy History, Site
Map, or on each request/response item that is sent or received by Burp Suite.
@@ -12,7 +12,7 @@ the data collected and processed by Burp Suite.

The following screenshot provides an example of how Turbo Data Miner can be used to obtain a structured presentation of all
cookies (and their attributes) that are stored in the current Burp Suite project. At the bottom (see 1), you select
-the corresponding Python script in the combobox. By clicking button `Load Script`, the selected
+the corresponding Python script in the combo box. By clicking button `Load Script`, the selected
code is then loaded into the IDE text area and can be customized, if needed. Alternatively, you can create your own
script by clicking button `New Script`. The analysis is started by clicking button `Start`. Afterwards, Turbo Data Miner
executes the compiled Python script on each Request/Response item. Thereby, the script extracts cookie information
@@ -22,10 +22,10 @@ column to gain a better understanding of each attribute or perform additional operations.

![Turbo Data Miner's Proxy History Analyzer](example.png)

-As you can see, with Python skills, an understanding of the
+As we can see, with Python skills, an understanding of the
[Burp Suite Extender API](https://portswigger.net/Burp/extender/api/index.html) as well as an understanding of Turbo
Miner's API (see Turbo Data Miner tab `About` or directly the
-[HTML page](https://github.com/chopicalqui/TurboDataMiner/blob/master/turbominer/about.html) used by the `About` tab),
+[HTML page](https://github.com/chopicalqui/TurboDataMiner/blob/master/turbodataminer/about.html) used by the `About` tab),
you can extract and structure any information available in the current Burp Suite project.
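
To make the scripting model concrete, here is a minimal sketch of such a script. It is an illustration, not one of the shipped scripts, and it assumes the script-local variables that the bundled template scripts use (`ref`, `header`, `rows`, `message_info`, `helpers`, `url`):

```python
# Minimal Turbo Data Miner script sketch: collect response cookies into the
# result table. Assumes the script-local variables used by the bundled
# template scripts (ref, header, rows, message_info, helpers, url).
if ref == 1:
	header = ["Ref.", "Host", "Cookie Name", "Cookie Value"]

response = message_info.getResponse()
if response:
	response_info = helpers.analyzeResponse(response)
	for line in response_info.getHeaders():
		# Set-Cookie: name=value; Path=/; Secure; HttpOnly; ...
		if line.lower().startswith("set-cookie:"):
			cookie = line.split(":", 1)[1].split(";")[0].strip()
			name, _, value = cookie.partition("=")
			rows.append([ref, url.getHost(), name, value])
```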


@@ -146,7 +146,8 @@ This tab contains the documentation about Turbo Data Miner's Application Programming Interface (API).

# Author

-**Lukas Reiter** (@chopicalquy) - [Turbo Data Miner](https://github.com/chopicalqui/TurboDataMiner)
+**Lukas Reiter** ([@chopicalquy](https://twitter.com/chopicalquy)) -
+[Turbo Data Miner](https://github.com/chopicalqui/TurboDataMiner)

# License

10 changes: 5 additions & 5 deletions turbodataminer/about.html
@@ -7,7 +7,7 @@
<body>
<h1>Turbo Data Miner</h1>

-This extension adds a new tab <code>Turbo Miner</code> to Burp Suite's GUI as well as an new entry
+This extension adds a new tab <code>Turbo Miner</code> to Burp Suite's GUI as well as a new entry
<code>Process in Turbo Miner</code> to Burp Suite's context menu. In the new tab, you are able to write new or select
existing Python scripts that are executed on each request/response item currently stored in the Proxy History,
Site Map, or on each request/response item that is sent or received by Burp Suite.
@@ -762,15 +762,15 @@ <h2>Method Details</h2>

<dl><dt><a name="ExportedMethods-analyze_request"><strong>analyze_request</strong></a>(self, message_info)</dt><dd><tt>This&nbsp;method&nbsp;returns&nbsp;an&nbsp;IRequestInfo&nbsp;object&nbsp;based&nbsp;on&nbsp;the&nbsp;given&nbsp;IHttpRequestResponse&nbsp;object.<br>
<br>
-:param&nbsp;message_info&nbsp;(IHttpRequestResponse):&nbsp;The&nbsp;IHttpRequestResponse&nbsp;whose&nbsp;request&nbsp;should&nbsp;be&nbsp;returned&nbsp;as&nbsp;a<br>
+:param&nbsp;message_info&nbsp;(IHttpRequestResponse):&nbsp;The&nbsp;IHttpRequestResponse&nbsp;whose&nbsp;request&nbsp;should&nbsp;be&nbsp;returned&nbsp;as&nbsp;an<br>
IRequestInfo&nbsp;object.<br>
-:return&nbsp;(IRequestInfo):&nbsp;A&nbsp;IRequestInfo&nbsp;object&nbsp;or&nbsp;None,&nbsp;if&nbsp;no&nbsp;request&nbsp;was&nbsp;found.</tt></dd></dl>
+:return&nbsp;(IRequestInfo):&nbsp;An&nbsp;IRequestInfo&nbsp;object&nbsp;or&nbsp;None,&nbsp;if&nbsp;no&nbsp;request&nbsp;was&nbsp;found.</tt></dd></dl>

<dl><dt><a name="ExportedMethods-analyze_response"><strong>analyze_response</strong></a>(self, message_info)</dt><dd><tt>This&nbsp;method&nbsp;returns&nbsp;an&nbsp;IResponseInfo&nbsp;object&nbsp;based&nbsp;on&nbsp;the&nbsp;given&nbsp;IHttpRequestResponse&nbsp;object.<br>
<br>
-:param&nbsp;message_info&nbsp;(IHttpRequestResponse):&nbsp;The&nbsp;IHttpRequestResponse&nbsp;whose&nbsp;request&nbsp;should&nbsp;be&nbsp;returned&nbsp;as&nbsp;a<br>
+:param&nbsp;message_info&nbsp;(IHttpRequestResponse):&nbsp;The&nbsp;IHttpRequestResponse&nbsp;whose&nbsp;request&nbsp;should&nbsp;be&nbsp;returned&nbsp;as&nbsp;an<br>
IResponseInfo&nbsp;object.<br>
-:return&nbsp;(IResponseInfo):&nbsp;A&nbsp;IResponseInfo&nbsp;object&nbsp;or&nbsp;None,&nbsp;if&nbsp;no&nbsp;response&nbsp;was&nbsp;found.</tt></dd></dl>
+:return&nbsp;(IResponseInfo):&nbsp;An&nbsp;IResponseInfo&nbsp;object&nbsp;or&nbsp;None,&nbsp;if&nbsp;no&nbsp;response&nbsp;was&nbsp;found.</tt></dd></dl>

<dl><dt><a name="ExportedMethods-analyze_signatures"><strong>analyze_signatures</strong></a>(self, content, strict<font color="#909090">=False</font>)</dt><dd><tt>This&nbsp;method&nbsp;checks&nbsp;whether&nbsp;the&nbsp;given&nbsp;string&nbsp;matches&nbsp;one&nbsp;of&nbsp;the&nbsp;known&nbsp;file&nbsp;signatures&nbsp;specified&nbsp;in&nbsp;an&nbsp;internal<br>
database.<br>
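
Taken together, the two `analyze_*` methods documented above suggest usage like the following sketch. It assumes they are exposed to scripts under exactly these names, and it reuses the script-local variables from the bundled template scripts:

```python
# Hedged sketch: tabulate each item's request method and response status code
# via the exported methods documented above. The surrounding variables
# (ref, header, rows, message_info, url) follow the bundled template scripts.
if ref == 1:
	header = ["Ref.", "Host", "Method", "Status"]

request_info = analyze_request(message_info)    # IRequestInfo or None
response_info = analyze_response(message_info)  # IResponseInfo or None
if request_info and response_info:
	rows.append([ref, url.getHost(), request_info.getMethod(), response_info.getStatusCode()])
```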
2 changes: 1 addition & 1 deletion turbodataminer/data/version-vulns.json
@@ -415,7 +415,7 @@
},
"Liferay, headers": {
"alias": "cpe:/a:liferay:liferay_portal",
"regex": "Liferay-Portal:\\s*[a-z\\s]+([\\d.]+)",
"regex": "Liferay-Portal:\\s*[a-zA-Z\\s]+([\\d.]+)",
"type": "cpe"
},
"LinkSmart, script": {
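
The motivation for widening the character class: real `Liferay-Portal` response headers contain capitalized product names, which `[a-z\s]+` cannot match through. A quick standalone check (the header value below is illustrative, not taken from the commit):

```python
import re

old_pattern = re.compile(r"Liferay-Portal:\s*[a-z\s]+([\d.]+)")
new_pattern = re.compile(r"Liferay-Portal:\s*[a-zA-Z\s]+([\d.]+)")

# Illustrative banner; the capitalized words break the old character class.
banner = "Liferay-Portal: Liferay Community Edition Portal 7.3.5"
print(old_pattern.search(banner))           # None
print(new_pattern.search(banner).group(1))  # 7.3.5
```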
@@ -6,6 +6,6 @@
],
"uuid": "15302930-aa43-4d7a-88c5-434bc4b9f763",
"version": "v1.0",
"script": "\"\"\"\nThis script searches in-scope HTTP requests and responses for information based on regular expressions\nthat are stored in dictionary session[\"relist\"] and if found, adds the identified value to the table above.\n\nNote that each regular expression must contain the named group \"value\" whose content will be extracted\nadded to the table.\n\"\"\"\nimport re\n# Do the initial setup\nif ref == 1 or \"relist\" not in session:\n\tsession[\"relist\"] = {\n\t\t\t\t\t\t\t\t\t\t\t\"guid\": re.compile(\"[^0-9a-zA-Z](?P<value>[0-9a-zA-Z]{8,8}(-[0-9a-zA-Z]{4,4}){3,3}-[0-9a-zA-Z]{12,12})[^0-9a-zA-Z]\"),\n\t\t\t\t\t\t\t\t\t\t\t#\"meta-tag\": re.compile(\"(?P<value><meta .*?>)\")\n\t\t\t\t\t\t\t\t\t\t\t}\n\theader = [\"Ref.\", \"Host\", \"Extracted From\", \"Category\", \"Value\"]\n\ndef parse_content(content, regexes):\n\t\"\"\"\n\tThis method implements the core functionality to extract information\n\tfrom requests or responses based on the given regular expressions.\n\t\"\"\"\n\trvalues = {}\n\tfor key, value in regexes.items():\n\t\tfor match in value.finditer(content):\n\t\t\tif key in rvalues:\n\t\t\t\trvalues[key].append(match.group(\"value\"))\n\t\t\telse:\n\t\t\t\trvalues[key] = [match.group(\"value\")]\n\treturn rvalues\n\n# Process only in-scope requests and responses\nif in_scope:\n\trequest = message_info.getRequest()\n\tresponse = message_info.getResponse()\n\trequest_string = helpers.bytesToString(request).encode(\"utf-8\")\n\tresponse_string = helpers.bytesToString(response).encode(\"utf-8\") if response else \"\"\n\trequest_list = parse_content(request_string, session[\"relist\"])\n\tresponse_list = parse_content(response_string, session[\"relist\"])\n\tfor key, values in request_list.items():\n\t\trows.extend([[ref, url.getHost(), \"Request\", key, item] for item in values if item != url.getHost()])\n\tfor key, values in response_list.items():\n\t\trows.extend([[ref, url.getHost(), \"Response\", key, item] for item in values if item != url.getHost()])",
"script": "\"\"\"\nThis script searches in-scope HTTP requests and responses for information based on regular expressions\nthat are stored in dictionary session[\"relist\"] and if found, adds the identified value to the table above.\n\nNote that each regular expression must contain the named group \"value\" whose content will be extracted\nand added to the table.\n\"\"\"\nimport re\n# Do the initial setup\nif ref == 1 or \"relist\" not in session:\n\tsession[\"relist\"] = {\n\t\t\t\t\t\t\t\t\t\t\t\"guid\": re.compile(\"[^0-9a-zA-Z](?P<value>[0-9a-zA-Z]{8,8}(-[0-9a-zA-Z]{4,4}){3,3}-[0-9a-zA-Z]{12,12})[^0-9a-zA-Z]\"),\n\t\t\t\t\t\t\t\t\t\t\t#\"meta-tag\": re.compile(\"(?P<value><meta .*?>)\")\n\t\t\t\t\t\t\t\t\t\t\t}\n\theader = [\"Ref.\", \"Host\", \"Extracted From\", \"Category\", \"Value\"]\n\ndef parse_content(content, regexes):\n\t\"\"\"\n\tThis method implements the core functionality to extract information\n\tfrom requests or responses based on the given regular expressions.\n\t\"\"\"\n\trvalues = {}\n\tfor key, value in regexes.items():\n\t\tfor match in value.finditer(content):\n\t\t\tif key in rvalues:\n\t\t\t\trvalues[key].append(match.group(\"value\"))\n\t\t\telse:\n\t\t\t\trvalues[key] = [match.group(\"value\")]\n\treturn rvalues\n\n# Process only in-scope requests and responses\nif in_scope:\n\trequest = message_info.getRequest()\n\tresponse = message_info.getResponse()\n\trequest_string = helpers.bytesToString(request).encode(\"utf-8\")\n\tresponse_string = helpers.bytesToString(response).encode(\"utf-8\") if response else \"\"\n\trequest_list = parse_content(request_string, session[\"relist\"])\n\tresponse_list = parse_content(response_string, session[\"relist\"])\n\tfor key, values in request_list.items():\n\t\trows.extend([[ref, url.getHost(), \"Request\", key, item] for item in values if item != url.getHost()])\n\tfor key, values in response_list.items():\n\t\trows.extend([[ref, url.getHost(), \"Response\", key, item] for item in values if item != url.getHost()])",
"name": "Misc - Template Script to Extract Information From In-Scope Requests and Responses"
}
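
The docstring's rule that every expression must define the named group `value` is easy to verify in isolation; the sample input below is made up:

```python
import re

# Only the content of the named group "value" ends up in the result table.
guid_re = re.compile("[^0-9a-zA-Z](?P<value>[0-9a-zA-Z]{8,8}(-[0-9a-zA-Z]{4,4}){3,3}-[0-9a-zA-Z]{12,12})[^0-9a-zA-Z]")
content = 'token="123e4567-e89b-12d3-a456-426614174000";'
for match in guid_re.finditer(content):
	print(match.group("value"))  # 123e4567-e89b-12d3-a456-426614174000
```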
@@ -6,6 +6,6 @@
],
"uuid": "45516c08-5289-4c66-ae30-8e7c69319e0a",
"version": "v1.0",
"script": "\"\"\"\nThis script adds/from to relationships to the above table by analysing each message info item's Referer and Location header.\nThis information (especially first column of last row) can then be used to plot a relationship map using Python's matplotlib\nand networkx libraries by executing the following steps:\n1. Run the Turbo Data Miner Script\n\n2. Install necessary Python libraries\n$ pip install igviz matplotlib networkx numpy\n\n3. Setup Python environment\n>>> import matplotlib.pyplot as plt\n>>> import networkx as nx\n>>> import igviz as ig\n>>> DG = nx.DiGraph()\n\n4. Click into the first column of the last row, right click, and select 'Copy Cell Value'\n\n5. Select the below placeholder and paste the content of the clipboard\n>>> DG.add_weighted_edges_from([PLACEHOLDER])\n\n6a. Plot the directed graph using matplotlib\n>>> nx.draw(DG, with_labels=True, font_weight='bold')\n>>> plt.show()\n\n6b. Plot the directed graph using igviz\n>>> fig = ig.plot(DG)\n>>> fig.show()\n\"\"\"\n\nfrom java.net import URL\nif ref == 1 or \"dedup\" not in session:\n\theader = [\"From\", \"To\"]\n\tsession[\"dedup\"] = {}\n\tsession[\"rows\"] = []\t\n\nrequest = message_info.getRequest()\nresponse = message_info.getResponse()\nif response:\n\thost_name = get_hostname(url)\n\trequest_info = helpers.analyzeRequest(request)\n\tresponse_info = helpers.analyzeResponse(response)\n\treferer = get_header(request_info.getHeaders(), \"Referer\")[1]\n\tlocation = get_header(response_info.getHeaders(), \"Location\")[1]\n\t# prepare referer and location\n\treferer_str = get_hostname(URL(referer)) if referer else None\n\tlocation_str = get_hostname(URL(location)) if location and location[0] != '/' else None\n\t# Add referer and location headers to table\n\tif referer_str and referer_str != host_name:\n\t\tkey = \"{}>{}\".format(referer_str, host_name)\n\t\tif key not in session[\"dedup\"]:\n\t\t\trows.append([referer_str, host_name])\n\t\t\tsession[\"rows\"].append((\"{}\".format(referer_str), \"{}\".format(host_name), 0.125))\n\t\t\tsession[\"dedup\"][key] = None\n\tif location_str and location_str != host_name:\n\t\tkey = \"{}>{}\".format(host_name, location_str)\n\t\tif key not in session[\"dedup\"]:\n\t\t\trows.append([host_name, location_str])\n\t\t\tsession[\"rows\"].append((\"{}\".format(host_name), \"{}\".format(location_str), 0.125))\n\t\t\tsession[\"dedup\"][key] = None\n# Add the list for the Python library networkx to the last row\nif ref == row_count:\n\trows.append([session[\"rows\"]])\n\t",
"script": "\"\"\"\nThis script adds/from to relationships to the above table by analysing each message info item's Referer and Location header.\nThis information (especially first column of last row) can then be used to plot a relationship map using Python's matplotlib\nand networkx libraries by executing the following steps:\n1. Run the Turbo Data Miner Script\n\n2. Install necessary Python libraries\n$ pip install igviz matplotlib networkx numpy\n\n3. Setup Python environment\n>>> import matplotlib.pyplot as plt\n>>> import networkx as nx\n>>> import igviz as ig\n>>> DG = nx.DiGraph()\n\n4. Click into the first column of the last row, right click, and select 'Copy Cell Value'\n\n5. Select the below PLACEHOLDER and replace it with the content of the clipboard\n>>> DG.add_weighted_edges_from([PLACEHOLDER])\n\n6a. Plot the directed graph using matplotlib\n>>> nx.draw(DG, with_labels=True, font_weight='bold')\n>>> plt.show()\n\n6b. Plot the directed graph using igviz\n>>> fig = ig.plot(DG)\n>>> fig.show()\n\"\"\"\nimport re\nfrom java.net import URL\n\nif ref == 1 or \"dedup\" not in session:\n\theader = [\"From\", \"To\"]\n\tsession[\"dedup\"] = {}\n\tsession[\"rows\"] = []\n\nrequest = message_info.getRequest()\nresponse = message_info.getResponse()\nif response:\n\thost_name = get_hostname(url)\n\trequest_info = helpers.analyzeRequest(request)\n\tresponse_info = helpers.analyzeResponse(response)\n\treferer = get_header(request_info.getHeaders(), \"Referer\")[1]\n\tlocation = get_header(response_info.getHeaders(), \"Location\")[1]\n\t# prepare referer and location\n\treferer_str = get_hostname(URL(referer)) if referer else None\n\tlocation_str = get_hostname(URL(location)) if location and re.match(\"https?://\", location) else None\n\t# Add referer and location headers to table\n\tif referer_str and referer_str != host_name:\n\t\tkey = \"{}>{}\".format(referer_str, host_name)\n\t\tif key not in session[\"dedup\"]:\n\t\t\trows.append([referer_str, host_name])\n\t\t\tsession[\"rows\"].append((\"{}\".format(referer_str), \"{}\".format(host_name), 0.125))\n\t\t\tsession[\"dedup\"][key] = None\n\tif location_str and location_str != host_name:\n\t\tkey = \"{}>{}\".format(host_name, location_str)\n\t\tif key not in session[\"dedup\"]:\n\t\t\trows.append([host_name, location_str])\n\t\t\tsession[\"rows\"].append((\"{}\".format(host_name), \"{}\".format(location_str), 0.125))\n\t\t\tsession[\"dedup\"][key] = None\n# Add the list for the Python library networkx to the last row\nif ref == row_count:\n\trows.append([session[\"rows\"]])\n\t",
"name": "Misc - Template Script to Visualize Web Site Relationships"
}
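
Steps 3 to 6a of the script's docstring, condensed into one runnable snippet; the edge list is made up and stands in for the cell value copied from the result table:

```python
import matplotlib.pyplot as plt
import networkx as nx

DG = nx.DiGraph()
# Replace this illustrative list with the copied cell value (step 5).
DG.add_weighted_edges_from([
	("www.example.com", "cdn.example.net", 0.125),
	("www.example.com", "sso.example.org", 0.125),
])
nx.draw(DG, with_labels=True, font_weight="bold")
plt.show()
```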
