Customize robots.txt.liquid

The robots.txt file tells search engines which pages can, or can't, be crawled on a site. It contains groups of rules for doing so, and each group has three main components:

  • The user agent, which identifies the crawler that the group of rules applies to. For example, adsbot-google.

  • The rules themselves, which specify the URLs that crawlers can, or can't, access.

  • An optional sitemap URL.
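For illustration, a single group with all three components might look like the following. The path and sitemap URL here are placeholders, not Shopify's actual defaults:

```text
# The user agent: this group applies to all crawlers
User-agent: *
# The rules: URLs that crawlers can't access
Disallow: /cart
# The optional sitemap URL
Sitemap: https://example.com/sitemap.xml
```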

Shopify generates a default robots.txt file that works for most stores. However, you can add the robots.txt.liquid template to make customizations.

In this tutorial, you'll learn how you can customize the robots.txt.liquid template.

Requirements

Add the robots.txt.liquid template with the following steps:

  1. In the code editor for the theme you want to edit, open the Templates folder.

  2. Click Add a new template.

  3. Select robots.txt from the Create a new template for drop-down menu.

  4. Click Create template.

Resources

The robots.txt.liquid template supports only the following Liquid objects:

  • robots

  • group

  • rule

  • user_agent

  • sitemap
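For reference, the default rules are produced by a loop over robots.default_groups. A minimal sketch of that loop, built from the objects above (the exact whitespace control in your template may differ), looks like this:

```liquid
{% for group in robots.default_groups %}
  {%- comment -%} Output the user agent line for the group {%- endcomment -%}
  {{- group.user_agent }}

  {%- comment -%} Output each rule in the group {%- endcomment -%}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Output the sitemap URL, if the group has one {%- endcomment -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

The customizations in the following sections adjust this loop.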

Add a new rule to an existing group

If you want to add a new rule to an existing group, then you can adjust the Liquid for outputting the default rules to check for the associated group and include your rule.

For example, you can use the following to block all crawlers from accessing pages with the URL parameter ?q=:
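A minimal sketch, assuming the default-rules loop over robots.default_groups shown under Resources: it appends an extra Disallow rule to the group whose user agent is *, which applies to all crawlers.

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Add a rule blocking ?q= URLs to the group for all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```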

Remove a default rule from an existing group

If you want to remove a default rule from an existing group, then you can adjust the Liquid for outputting the default rules to check for that rule and skip over it.

For example, you can use the following to remove the rule blocking crawlers from accessing the /policies/ page:
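A minimal sketch, again assuming the default-rules loop and the rule object's directive and value attributes: it outputs every default rule except the Disallow rule for /policies/.

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- comment -%} Skip the default rule that blocks /policies/ {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```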

Add custom rules

If you want to add a new rule that's not part of a default group, then you can manually enter the rule outside of the Liquid for outputting the default rules.

Common examples of these custom rules include blocking certain crawlers, allowing certain crawlers, and adding extra sitemap URLs, as described below.

Block certain crawlers

If a crawler isn't in the default rule set, then you can manually add a rule to block it.

For example, the following directive would allow you to block the discobot crawler:
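Because plain text in robots.txt.liquid is output as-is, a rule like the following sketch can be placed outside the default-rules Liquid:

```text
# Block the discobot crawler from the entire site
User-agent: discobot
Disallow: /
```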

Allow certain crawlers

Similar to blocking certain crawlers, you can also manually add a rule to allow search engines to crawl a subdirectory or page.

For example, the following directive would allow the discobot crawler:
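As with blocking, the rule is added as plain text outside the default-rules Liquid. This sketch allows discobot to crawl the entire site; replace / with a subdirectory or page path to narrow it:

```text
# Allow the discobot crawler to access the entire site
User-agent: discobot
Allow: /
```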

Add extra sitemap URLs

The following example, where [sitemap-url] is the sitemap URL, would allow you to include an extra sitemap URL:
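Added as plain text outside the default-rules Liquid:

```text
Sitemap: [sitemap-url]
```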
