Customizing a site's robots.txt

A robots.txt file provides crawling guidance to search engines, such as the following:

  • Which user agents are allowed to crawl the site
  • Which URLs to crawl or skip
  • Where sitemaps are located
  • How frequently to crawl

Properly configuring your site's robots.txt enhances search-engine optimization.
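
For example, a robots.txt file covering all four kinds of guidance might look like the following; the user-agent name, path, and sitemap URL are placeholders, not values Brightspot generates:

User-agent: ExampleBot
Disallow: /

User-agent: *
Disallow: /search/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml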

Brightspot provides a default robots.txt with the following directives:

User-agent: *
Crawl-delay: 10

These defaults apply to all user agents and ask crawlers to wait 10 seconds between successive requests; note that not every search engine honors the Crawl-delay directive.

Brightspot creates the robots.txt file at the location specified in the Main > Default Site URL field. For example, if your site's default URL is https://brightspot.com, Brightspot serves the file at https://brightspot.com/robots.txt.
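
Once the site is live, you can spot-check the published file. The following is a minimal sketch using Python's standard-library robotparser; substitute your own site's default URL:

from urllib.robotparser import RobotFileParser

# Load the robots.txt that Brightspot publishes at the default site URL.
robots = RobotFileParser()
robots.set_url("https://brightspot.com/robots.txt")
robots.read()

# With the default directives, the crawl delay for any user agent is 10
# and no URL is disallowed.
print(robots.crawl_delay("*"))
print(robots.can_fetch("*", "https://brightspot.com/"))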

To customize a site's robots.txt:

  1. In the navigation menu, click Admin > Sites & Settings.
  2. In the Sites widget, select the site for which you are configuring robots.txt, or select Global to configure robots.txt for all sites.
  3. Click the search icon and type robots.txt to locate the robots.txt field.
  4. In the robots.txt field, enter the directives you want search engines to follow (an example appears after these steps). Consult each search engine's documentation for the list of directives it honors.
  5. Click Save.
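
As an illustration of step 4, the following entry asks all crawlers to skip a hypothetical /preview/ section, keeps the default crawl delay, and declares a sitemap; the path and sitemap URL are placeholders, not Brightspot defaults:

User-agent: *
Disallow: /preview/
Crawl-delay: 10

Sitemap: https://brightspot.com/sitemap.xml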
