robots.txt: disallow crawling when not in production. (#27559)
XhmikosR authored Nov 2, 2018
1 parent 4b15ec9 commit 3256a2c
Showing 1 changed file with 1 addition and 1 deletion.
site/robots.txt: 1 addition, 1 deletion
@@ -5,5 +5,5 @@
 
 # Allow crawling of all content
 User-agent: *
-Disallow:
+Disallow:{% if jekyll.environment != "production" %} /{% endif %}
 Sitemap: {{ site.url }}/sitemap.xml
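In Jekyll, jekyll.environment reflects the JEKYLL_ENV environment variable and defaults to "development", so any build not explicitly marked as production now renders "Disallow: /" and blocks all crawlers, while a production build keeps the empty "Disallow:" that allows full crawling. A sketch of the rendered robots.txt under each environment (the sitemap URL assumes site.url resolves to https://getbootstrap.com; substitute your own value):

# Rendered with the default environment (e.g. plain "jekyll build"): crawling disallowed
User-agent: *
Disallow: /
Sitemap: https://getbootstrap.com/sitemap.xml

# Rendered with "JEKYLL_ENV=production jekyll build": crawling allowed
User-agent: *
Disallow:
Sitemap: https://getbootstrap.com/sitemap.xml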
