## About CrawlBot

CrawlBot is a simple shell script that fetches the `robots.txt` file of a given domain. On success, the results are written to three text files (`allows.txt`, `disallows.txt`, `all.txt`) inside a directory created for that domain.
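
The actual logic lives in `crawlbot.sh`; as a rough illustration of the idea, a minimal sketch (assuming `curl` and `grep` are available, with file and directory names taken from the description above) might look like this:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the CrawlBot idea, not the actual crawlbot.sh:
# fetch a domain's robots.txt and split its rules into three files.
set -eu

domain="$1"                       # e.g. google.com
mkdir -p "$domain"                # per-domain output directory

# Download robots.txt; the full rule set goes into all.txt.
curl -fsSL "https://$domain/robots.txt" -o "$domain/all.txt"

# Separate the Allow and Disallow rules into their own files.
# (grep exits non-zero when nothing matches, hence the || true guards.)
grep -i '^Allow:'    "$domain/all.txt" > "$domain/allows.txt"    || true
grep -i '^Disallow:' "$domain/all.txt" > "$domain/disallows.txt" || true
```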

## Screenshots

## Installation

```
user@example:~$ git clone
user@example:~$ cd crawlbot
user@example:~$ chmod +x *
user@example:~$ ./crawlbot.sh domain_name
```

## Usage

```
user@example:~$ ./crawlbot.sh domain_name
user@example:~$ ./crawlbot.sh google.com
```
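
If the run succeeds, the results can be inspected in the directory created for the domain, for example:

```
user@example:~$ cat google.com/disallows.txt
```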

## Credits

Special thanks to:

## Version

The current version is 1.0.