shareefshaz/crawlbot

About CrawlBot

CrawlBot is a simple shell script that fetches and parses the 'robots.txt' file of a given domain. If the process succeeds, the results are written to three text files (allows.txt, disallows.txt, all.txt) inside a directory created for that domain.
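The parsing step described above can be sketched as follows. This is only an illustration of the technique, not the contents of crawlbot.sh itself; it uses an inline sample robots.txt (the `robots` variable and the `example.com` directory name are assumptions) so it runs offline. In real use the file would be fetched first, e.g. with `curl -s "https://$domain/robots.txt"`.

```shell
# Sample robots.txt content (hypothetical; a real run would fetch it with curl).
robots='User-agent: *
Allow: /public
Disallow: /admin
Disallow: /private'

# Create the per-domain output directory.
mkdir -p example.com

# Split the rules into the three result files CrawlBot produces.
printf '%s\n' "$robots" | grep -i '^Allow:'              > example.com/allows.txt
printf '%s\n' "$robots" | grep -i '^Disallow:'           > example.com/disallows.txt
printf '%s\n' "$robots" | grep -iE '^(Allow|Disallow):'  > example.com/all.txt
```

Anchoring the patterns at the start of the line keeps `Allow:` rules out of disallows.txt, since `Disallow:` only matches the disallow pattern.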

Installation

user@example:~$ git clone https://github.com/adhithyanmv/crawlbot.git
user@example:~$ cd crawlbot
user@example:~$ chmod +x *
user@example:~$ ./crawlbot.sh domain_name

Usage

user@example:~$ ./crawlbot.sh domain_name
user@example:~$ ./crawlbot.sh google.com

Credits

Special thanks to:

Version

Current version: 1.0
