This repository has been archived by the owner on Jan 24, 2019. It is now read-only.

Provide a robots.txt that denies all crawlers #90

Merged · jehiah merged 1 commit into bitly:master from the robots-txt branch on May 10, 2015

Conversation

mbland (Contributor) commented on May 10, 2015

All the trawling through logs as part of #88 and #89 made me think a /robots.txt endpoint might be useful. ;-)

cc: @jehiah
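For context, a deny-all robots.txt is only two lines (`User-agent: *` / `Disallow: /`), so serving it needs nothing more than a small handler. Below is a minimal standalone sketch in Go; the handler name `RobotsTxt`, the port, and the mux wiring are illustrative assumptions, not the exact code from commit 5c03fe3.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// RobotsTxt serves a robots.txt that asks every well-behaved
// crawler to stay away from the entire site:
//
//	User-agent: *
//	Disallow: /
func RobotsTxt(rw http.ResponseWriter, _ *http.Request) {
	fmt.Fprint(rw, "User-agent: *\nDisallow: /\n")
}

func main() {
	// Route and port are illustrative, not taken from the actual commit.
	http.HandleFunc("/robots.txt", RobotsTxt)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Note that robots.txt is purely advisory: it keeps compliant crawlers (and the log noise they generate, per the motivation above) away from the proxy's endpoints, but it is not an access control.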

jehiah added a commit referencing this pull request on May 10, 2015: "Provide a robots.txt that denies all crawlers"
jehiah merged commit 5c03fe3 into bitly:master on May 10, 2015.
mbland deleted the robots-txt branch on May 10, 2015 at 23:39.