AWSSCRAPER

A tool to scrape AWS for external endpoints (Route53 and Elastic IPs). The easiest way to view all public endpoints within your AWS accounts.

This tool supports:

  • AWS Route53, TLDs, and EIPs (see the scraping sketch below)
  • Linking multiple teams and accounts
  • Exporting results as CSV or JSON
  • API endpoints
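
Under the hood, listing these endpoints comes down to a handful of AWS API calls. The sketch below is illustrative only, not the code in dnsscraper.py: it shows how Route53 records and Elastic IPs can be enumerated with boto3 (pagination omitted for brevity).

import json

import boto3

def scrape_route53():
    """Yield (name, type, value) for every record in every hosted zone."""
    r53 = boto3.client("route53")
    for zone in r53.list_hosted_zones()["HostedZones"]:
        rrsets = r53.list_resource_record_sets(HostedZoneId=zone["Id"])
        for rrset in rrsets["ResourceRecordSets"]:
            for record in rrset.get("ResourceRecords", []):
                yield (rrset["Name"], rrset["Type"], record["Value"])

def scrape_eips():
    """Yield every Elastic IP allocated in the account."""
    ec2 = boto3.client("ec2")
    for address in ec2.describe_addresses()["Addresses"]:
        yield address["PublicIp"]

if __name__ == "__main__":
    print(json.dumps({"route53": list(scrape_route53()),
                      "eips": list(scrape_eips())}, indent=2))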


Usage (Docker):

Install Docker

Install Docker from the Docker website; OS package repositories are usually outdated.

Build Docker image

$ docker build -t awsscraper:latest .

Run Docker image

To run awsscraper and expose the website on port 80 of 127.0.0.1:

$ aws-vault exec default -- ./run_docker.sh

Make sure the AWS credentials used can access the audit role.

Usage (local):

Install (for use or dev)

$ git clone <repo>
$ cd <repo>/awsscraper
$ pipenv install --three

Command line

  • Run pipenv run ./dnsscraper.py and pass in the account name, using aws-vault for AWS credentials; see the example below.
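
For example, using the default profile from the Docker section above (the account name myaccount is hypothetical):

$ aws-vault exec default -- pipenv run ./dnsscraper.py myaccount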

Web

  • Run pipenv run python routes.py
  • Visit http://127.0.0.1/test/<accountname>
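
For reference, here is a minimal sketch of what such a route could look like, assuming routes.py is a Flask-style app; the framework choice, handler name, and scrape_account stub are assumptions, not the repo's actual code.

from flask import Flask, jsonify

app = Flask(__name__)

def scrape_account(account):
    # Hypothetical stand-in for the real scraper (see dnsscraper.py);
    # the actual function name and return shape may differ.
    return {"account": account, "route53": [], "eips": []}

@app.route("/test/<account>")
def test_account(account):
    # Serve the scraped endpoints for one account as JSON.
    return jsonify(scrape_account(account))

if __name__ == "__main__":
    # Port 80 matches the URL above; binding to it may require privileges.
    app.run(host="127.0.0.1", port=80)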

Requirements:

  • Python 3
  • Pipenv
  • The built-in Python libraries socket and json
  • Audit role access (either a role switch or a custom account with access)

TODO:

  • Add a note about the API endpoints to the default page/footer
  • Add a full REST API (JSON results, not HTML)
  • Refactor error handling in scrape_aws/processfunction to handle accounts independently (see the sketch after this list)
  • Add try/except handling in routes.py
  • Improve the web templates, and add JS sorting and filtering
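
As a starting point for the per-account error-handling refactor, one possible shape is sketched below; scrape_account is a hypothetical stand-in for the real entry point in scrape_aws/processfunction.

def scrape_account(account):
    # Hypothetical stand-in for the real per-account scraper.
    return {"account": account}

def scrape_all(accounts):
    """Scrape each account independently so one failure does not abort the run."""
    results, errors = {}, {}
    for account in accounts:
        try:
            results[account] = scrape_account(account)
        except Exception as exc:
            # Record the failure and keep going with the remaining accounts.
            errors[account] = str(exc)
    return results, errors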
