DOCS: Add site search optimization (#58)
jdillard authored Jan 28, 2023
1 parent 8eff853 commit f6c998c
Showing 4 changed files with 20 additions and 3 deletions.
2 changes: 2 additions & 0 deletions CHANGELOG.rst
@@ -14,6 +14,8 @@ Changelog
   `#55 <https://github.com/jdillard/sphinx-sitemap/pull/55>`_
 * |:sparkles:| NEW: Add support for Sphinx config "html_file_suffix"
   `#57 <https://github.com/jdillard/sphinx-sitemap/pull/57>`_
+* |:books:| DOCS: Add site search optimization
+  `#58 <https://github.com/jdillard/sphinx-sitemap/pull/58>`_
 
 2.4.0
 -----
1 change: 1 addition & 0 deletions docs/_vale/ignore_words.txt
@@ -1 +1,2 @@
 Conda
+Algolia
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -12,7 +12,7 @@ documentation.

    getting-started
    advanced-configuration
-   search-engine-optimization
+   search-optimization
    configuration-values
    contributing
    changelog
18 changes: 16 additions & 2 deletions docs/search-engine-optimization.rst → docs/search-optimization.rst
@@ -1,8 +1,11 @@
 Getting the Most out of the Sitemap
 ===================================
 
+Search Engine Optimization
+--------------------------
+
 Using robots.txt
-----------------
+^^^^^^^^^^^^^^^^
 
 Add a **robots.txt** file in the **source** directory which contains a link to the **sitemap.xml** or **sitemapindex.xml** file. For example::
 
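The example file itself is collapsed in this view. As a hedged sketch (the domain is a placeholder), such a **robots.txt** typically looks like::

   User-agent: *

   Sitemap: https://my-site.example/docs/sitemap.xml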
@@ -17,6 +20,17 @@ Then, add **robots.txt** to :confval:`html_extra_path` in **conf.py**:
     html_extra_path = ['robots.txt']
 
 Submit Sitemap to Search Engines
---------------------------------
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Submit the **sitemap.xml** or **sitemapindex.xml** to the appropriate search engine tools.
+
+Site Search Optimization
+------------------------
+
+Site search crawlers can also take advantage of sitemaps as starting points for crawling.
+
+Examples:
+
+- `Algolia`_
+
+.. _Algolia: https://www.algolia.com/doc/tools/crawler/apis/configuration/sitemaps/
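For orientation, a minimal **conf.py** sketch tying these pieces together (the base URL is a placeholder; sphinx-sitemap derives the sitemap entries from :confval:`html_baseurl`)::

   # conf.py (sketch; values are placeholders)
   extensions = ["sphinx_sitemap"]

   # sphinx-sitemap builds the sitemap.xml URLs from this base URL
   html_baseurl = "https://my-site.example/docs/"

   # copy robots.txt from the source directory to the build root
   html_extra_path = ["robots.txt"]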

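On the Algolia side, the linked page documents a ``sitemaps`` option for the Algolia Crawler; a hypothetical crawler-configuration fragment pointing it at the generated file might read::

   sitemaps: ["https://my-site.example/docs/sitemap.xml"]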