
Sitemap.xml / Robots.txt

The Sitemap feature allows you to manage your sitemap.xml content, which lists your site’s important pages/items, along with their priority and last modified date, in XML format for SEO and site indexing purposes.

Sitemap.xml

3rd party services, such as search engines, can reference the sitemap.xml contents to crawl your site’s pages and gain a better understanding of each page’s priority and relevance.

XML content can be added here manually (by typing it in or pasting in XML generated by an external sitemap generator), or it can be generated by WebinOne (using the “Generate Sitemap” button) based on the SEO settings applied to your site’s individual pages and module items and/or the overall SEO settings found under ‘Settings’ > ‘SEO’. Here you can define excluded items, bulk assign items and set automatic sitemap generation. For more details on these settings see here.

When content is saved here, or generated automatically, a sitemap.xml file will be created in the root directory of your site, which will also be accessible via the File Manager and FTP.

Any changes saved here, or generated automatically, will override any existing sitemap.xml file in the root directory.
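
For reference, sitemap.xml content follows the standard XML sitemap format (sitemaps.org). A minimal, illustrative example listing a single page, with its last modified date and priority, might look like the following (the domain, date and priority values are placeholders only, not necessarily the exact output WebinOne generates):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourdomain.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>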

Robots.txt

WebinOne does not generate a robots.txt file or directly provide a specific tool for managing one. However, it’s worth mentioning here as it relates to the crawling of your site’s pages and files by 3rd party services, including search engines, and is often used to point to your sitemap.xml file.

While there is no tool specifically for robots.txt management, you can use the File Manager to conveniently create and manage text-based files.

The robots.txt file should be created in the root directory of your site with the file name robots.txt.

A minimal example of a robots.txt file, which allows all crawling agents with no restrictions and includes a sitemap reference, is as follows:

User-agent: *
Disallow:
Sitemap: https://www.yourdomain.com/sitemap.xml
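
As an illustrative variation, you could also restrict crawling of particular areas of your site by adding Disallow rules for specific paths, for example (the /private/ and /downloads/ paths below are hypothetical placeholders):

User-agent: *
Disallow: /private/
Disallow: /downloads/
Sitemap: https://www.yourdomain.com/sitemap.xml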

For more information on robots.txt uses, options and limitations, check the ‘External Resources’ section below.



Related Articles

  • Site Settings & Management > SEO
    Configuring these settings will help search engines and accessibility systems better understand the context of your website.
  • Site Settings & Management > API Applications
  • Site Settings & Management > HTTP Header Settings

External Resources


Please let us know if you have any other contributions or know of any helpful resources you'd like to see added here.


Questions?

We are always happy to help with any questions you may have.
Visit the Treepl Forum for community support and to search previously asked questions, or send us a message at support@webinone.com and we will get back to you as soon as possible.