
How to edit robots.txt in Shopify

In today’s blog we’ll talk about robots.txt.liquid: what robots.txt is, why it is an important part of SEO, what it does, and how to edit it in Shopify. We’ll answer all of these questions, so please read to the end.

What is robots.txt?

Robots.txt lets us tell crawlers which URLs they should not crawl. It often happens that URLs you never intended to publish get indexed on Google; to stop crawlers from visiting those URLs we use robots.txt, which is always located at https://yoursite.com/robots.txt. Note that robots.txt blocks crawling rather than indexing itself: a blocked URL that is linked from elsewhere can still show up in search results, so for guaranteed de-indexing use a noindex directive instead.
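The file itself is plain text. Here is a minimal example of the syntax (the disallowed path here is purely illustrative):

```txt
# Applies to all crawlers
User-agent: *
# Do not crawl anything under /private/
Disallow: /private/
# Location of the XML sitemap
Sitemap: https://yoursite.com/sitemap.xml
```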

How to edit Robots.txt

Step-by-step process to edit robots.txt:
1. Log in to your Shopify admin panel.
2. Go to Online Store.
3. Click the three dots next to the Customize button on the right side.
4. Select Edit code.
5. Select robots.txt under Templates.
Note: The robots.txt template does not exist by default. To create it, click Add a new template and choose robots.txt.
The robots.txt should look like this:

[Image: robots.txt.liquid in the Shopify code editor]

Robots.txt.liquid ships with default Liquid code that outputs all the rules Shopify applies out of the box.
Suggestion: don’t remove the default code, because it contains rules that Shopify has already optimised well.
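At the time of writing, the default robots.txt.liquid looks roughly like this: it loops over Shopify’s built-in rule groups and prints each group’s user agent, its rules, and the sitemap.

```liquid
{%- comment -%} Output Shopify's default robots.txt rules {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```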

How to add rules in robots.txt

To add rules, here is the step-by-step process:
1. Audit your URLs.
2. Create the robots.txt rules.
3. Test the rules with a robots.txt validator.
4. Add them to robots.txt.liquid.

Auditing URLs

Performing a full URL audit requires technical SEO skills, but in this blog I will describe a simple process for auditing your store’s URLs. Open the Page indexing report in Google Search Console (GSC) and look for URLs that should not appear in Google’s SERPs, such as the admin, cart, orders, and account pages. Screaming Frog is another great tool for crawling and auditing your URLs.

Create robots.txt

Let’s create the robots.txt rules. A rule group starts with User-agent: *; you can name a specific bot in place of * (for example, User-agent: SemrushBot). Next, disallow the unwanted pages with Disallow lines, as shown in the image below, and finally add a Sitemap directive at the end.

[Image: example robots.txt rules]
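Putting the pieces together, rules built from the audit above might look like this (the disallowed paths are examples; use the URLs from your own audit, and replace the sitemap URL with your store’s):

```txt
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /account
Sitemap: https://yoursite.com/sitemap.xml
```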

Checking the URLs

After performing the URL audit, let’s move on to a technical SEO testing tool to check whether the rules you created actually work.

[Image: technical SEO robots.txt testing tool]

Paste your rules into the box shown in the image below, enter a URL to check, select the user agent as Googlebot (or any other bot), and finally click Test to see the results.

[Image: robots.txt validator]

And the final result looks like this.

Adding them to robots.txt.liquid

The final step is to paste the newly created rules into robots.txt.liquid below the default rules generated by Shopify and click Save. That’s it: you have successfully customised your robots.txt.

[Image: custom rules added to robots.txt.liquid in Shopify]
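Instead of pasting plain-text rules below the loop, Shopify’s documentation also describes adding custom rules inside the default Liquid loop so they are grouped under the right user agent. A sketch, where the Disallow path is only an example to adapt to your own audit:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}
  {%- comment -%} Example: add a custom rule to the catch-all (*) group {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /search' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```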

To check the changes, add /robots.txt to the end of your store’s URL, e.g. https://webbytroops.com/robots.txt
