# Customize

The `robots.txt` file tells search engines which pages on a site can or can't be crawled. It consists of groups of rules, and each group has three main components:

* The user agent, which identifies the crawler that the group of rules applies to. For example, `adsbot-google`.
* The rules themselves, which specify the URLs that the crawler can or can't access.
* An optional sitemap URL.
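
For example, a single group containing all three components might look like the following (the domain and paths here are illustrative placeholders, not Shopify defaults):

```
User-agent: *
Disallow: /policies/
Sitemap: https://example.com/sitemap.xml
```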

Shopify generates a default `robots.txt` file that works for most stores. However, you can add the [`robots.txt.liquid` template](https://shopify.dev/docs/themes/architecture/templates/robots-txt-liquid) to make customizations.

In this tutorial, you'll learn how to customize the `robots.txt.liquid` template.

### Requirements <a href="#requirements" id="requirements"></a>

To customize your store's `robots.txt` file, you first need the `robots.txt.liquid` template in your theme. Add it with the following steps:

1. In the code editor for the theme you want to edit, open the **Templates** folder.
2. Click **Add a new template**.
3. Select `robots.txt` under the **Create a new template for** drop-down menu.
4. Click **Create template**.

### Resources <a href="#resources" id="resources"></a>

The `robots.txt.liquid` template supports only the following Liquid objects:

* [`robots`](https://shopify.dev/docs/api/liquid/objects/robots)
* [`group`](https://shopify.dev/docs/api/liquid/objects/group)
* [`rule`](https://shopify.dev/docs/api/liquid/objects/rule)
* [`user_agent`](https://shopify.dev/docs/api/liquid/objects/user_agent)
* [`sitemap`](https://shopify.dev/docs/api/liquid/objects/sitemap)
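
These objects come together in the Liquid that outputs the default rules: a loop over every default group that prints its user agent, rules, and sitemap. The customizations in the following sections all adjust this output:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```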

### Add a new rule to an existing group <a href="#add-a-new-rule-to-an-existing-group" id="add-a-new-rule-to-an-existing-group"></a>

If you want to add a new rule to an existing group, then you can adjust the Liquid for [outputting the default rules](https://shopify.dev/docs/themes/architecture/templates/robots-txt-liquid#content) to check for the associated group and include your rule.

For example, you can use the following to block all crawlers from accessing pages with the URL parameter `?q=`:

{% code overflow="wrap" lineNumbers="true" %}

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
      {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

{% endcode %}
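
With this change, the rendered group for all crawlers gains the extra line before the sitemap. The output would look something like the following (default rules abbreviated; `Disallow: /policies/` is one of the defaults, and the sitemap URL is a placeholder for your store's domain):

```
User-agent: *
Disallow: /policies/
Disallow: /*?q=*
Sitemap: https://your-store.myshopify.com/sitemap.xml
```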

### Remove a default rule from an existing group <a href="#remove-a-default-rule-from-an-existing-group" id="remove-a-default-rule-from-an-existing-group"></a>

If you want to remove a default rule from an existing group, then you can adjust the Liquid for [outputting the default rules](https://shopify.dev/docs/themes/architecture/templates/robots-txt-liquid#content) to check for that rule and skip over it.

For example, you can use the following to remove the default rule that blocks crawlers from accessing the `/policies/` page:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}

  {%- if group.sitemap != blank -%}
      {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

### Add custom rules <a href="#add-custom-rules" id="add-custom-rules"></a>

If you want to add a new rule that's not part of a default group, then you can manually enter the rule outside of the Liquid for [outputting the default rules](https://shopify.dev/docs/themes/architecture/templates/robots-txt-liquid#content).

Common examples of these custom rules are:

**Block certain crawlers**

If a crawler isn't in the default rule set, then you can manually add a rule to block it.

For example, the following rules block the `discobot` crawler from the entire site:

```
<!-- Liquid for default rules -->

User-agent: discobot
Disallow: /
```
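
As a quick sanity check after publishing, you can confirm how standard parsers interpret your rules with Python's built-in `urllib.robotparser`. The following is a sketch using inline example rules; in practice, you'd point `set_url()`/`read()` at your store's live `/robots.txt` and substitute real URLs:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content: a default group for all crawlers,
# plus a custom group that blocks the discobot crawler entirely.
lines = [
    "User-agent: *",
    "Disallow: /policies/",
    "",
    "User-agent: discobot",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(lines)

# discobot is blocked from every path...
print(parser.can_fetch("discobot", "https://example.com/products/shoe"))  # False
# ...while other crawlers are only blocked from /policies/
print(parser.can_fetch("*", "https://example.com/policies/refund"))       # False
print(parser.can_fetch("*", "https://example.com/products/shoe"))         # True
```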

**Allow certain crawlers**

Similar to blocking certain crawlers, you can manually add a rule that allows a specific crawler to access a subdirectory or page.

For example, the following rules allow the `discobot` crawler to access the entire site:

```
<!-- Liquid for default rules -->

User-agent: discobot
Allow: /
```

**Add extra sitemap URLs**

To include an extra sitemap, add a `Sitemap` directive after the Liquid for the default rules, where `[sitemap-url]` is the full URL of the sitemap:

```
<!-- Liquid for default rules -->

Sitemap: [sitemap-url]
```

