SEO Robots is a plugin for Grav CMS that provides a simple way to manage robots meta tags from the admin panel.
Installing the SEO Robots plugin can be done in one of two ways. The GPM (Grav Package Manager) installation method enables you to quickly and easily install the plugin with a tiny terminal command, while the manual method enables you to do so via a zip file.
The simplest way to install this plugin is via the Grav Package Manager (GPM) through your system's terminal (also called the command line). From the root of your Grav installation, type:
bin/gpm install seo-robots
This will install the SEO Robots plugin into your /user/plugins directory within Grav. Its files can be found under /your/site/grav/user/plugins/seo-robots.
To install this plugin manually, download the zip version of this repository and unzip it under /your/site/grav/user/plugins. Then rename the folder to seo-robots. You can find these files on GitHub or via GetGrav.org.
You should now have all the plugin files under /your/site/grav/user/plugins/seo-robots.
NOTE: This plugin is a modular component for Grav which requires Grav and the Error and Problems plugins to operate.
If you use the Admin plugin, you can install this plugin directly through it by browsing the Plugins tab and clicking on the Add button.
Before configuring this plugin, you should copy user/plugins/seo-robots/seo-robots.yaml to user/config/plugins/seo-robots.yaml and only edit that copy.
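On a Unix-like system, this copy can be made from the root of your Grav install (these commands assume the standard Grav directory layout described above):

```shell
mkdir -p user/config/plugins
cp user/plugins/seo-robots/seo-robots.yaml user/config/plugins/seo-robots.yaml
```

Editing the copy rather than the original ensures your configuration survives plugin updates.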
Here is the default configuration and an explanation of available options:
enabled: true
meta_robots:
  index: true
  follow: true
  noindex: false
  nofollow: false
  noimageindex: false
You just have to enable the plugin and configure your default settings to auto-generate SEO tags. You can see the result by viewing the page source in your browser.
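With the defaults above, the markup added to the page's head section would look something like the following (an illustration only; which directives are emitted, and in what order, depends on the plugin's output logic):

```html
<meta name="robots" content="index, follow">
```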
Robots meta directives (sometimes called “meta tags”) are pieces of code that provide crawlers instructions for how to crawl or index web page content. Whereas robots.txt file directives give bots suggestions for how to crawl a website's pages, robots meta directives provide more firm instructions on how to crawl and index a page's content.
You can choose between five modes:
- index: Tells a search engine to index a page. Note that you don’t need to add this meta tag; it’s the default.
- follow: Even if the page isn’t indexed, the crawler should follow all the links on a page and pass equity to the linked pages.
- noindex: Tells a search engine not to index a page.
- nofollow: Tells a crawler not to follow any links on a page or pass along any link equity.
- noimageindex: Tells a crawler not to index any images on a page.
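For example, a page that should be kept out of the search index and whose links should not be followed would carry a tag like this (the directive names are standard robots meta directives; the exact formatting of the plugin's output may differ):

```html
<meta name="robots" content="noindex, nofollow">
```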
If SEO Robots is correctly configured and enabled, you will see a new SEO tab when editing a page in the admin.
In this tab, each field has a placeholder that reflects the actual default value generated by SEO Robots.
To override a default value, simply fill in the corresponding field.
To get the default value back, simply remove your input.
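As a hypothetical illustration, assuming the admin SEO tab stores its overrides in the page's front matter under a meta_robots key (the actual key names used by the plugin may differ), a per-page override could look like:

```yaml
---
title: 'Landing Page'
meta_robots:
  noindex: true
  nofollow: true
---
```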