Robots.txt in Magento 2

  Last Update - 2023-07-04

As you know, configuring robots.txt is important for anyone working on a site's SEO. In particular, when you configure a sitemap so that search engines can index your store, you should also give web crawlers instructions in the robots.txt file so they skip the pages you do not want indexed. The robots.txt file, which resides in the root of your Magento installation, contains directives that search engines such as Google, Yahoo, and Bing recognize and follow. In this post, I will walk through configuring the robots.txt file so that it works well with your site.

Configure robots.txt in Magento 2

Admin Panel: Content >> Design >> Configuration

Select Edit on the design configuration row for your store

Expand the Search Engine Robots section
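If you prefer the command line over the admin panel, the same custom instructions can be written to Magento's configuration storage. A hedged sketch, assuming the config path `design/search_engine_robots/custom_instructions` used by Magento 2 for this setting; verify the path against your Magento version before relying on it:

```shell
# Assumed config path for the Search Engine Robots custom instructions.
bin/magento config:set design/search_engine_robots/custom_instructions \
"User-agent: *
Disallow: /var/"

# Flush the config cache so the updated instructions are served.
bin/magento cache:flush config
```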

Examples of Custom Instructions

  • Allows Full Access
User-agent: *
Disallow:

 

  • Disallows Access to All Folders
User-agent: *
Disallow: /

 

  • Default Instructions
User-agent: *
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
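To sanity-check what these directives actually block before deploying them, Python's standard-library `urllib.robotparser` evaluates robots.txt rules the way a well-behaved crawler would. A minimal sketch; the example URLs are placeholders, not your store's real paths:

```python
from urllib import robotparser

def build(lines):
    # Parse a robots.txt body supplied as a list of lines.
    rp = robotparser.RobotFileParser()
    rp.parse(lines)
    return rp

# An empty Disallow grants full access.
allow_all = build(["User-agent: *", "Disallow:"])
print(allow_all.can_fetch("*", "https://example.com/catalog/"))   # True

# The default instructions block utility paths but leave the rest open.
defaults = build(["User-agent: *", "Disallow: /var/", "Disallow: /lib/"])
print(defaults.can_fetch("*", "https://example.com/var/report"))  # False
print(defaults.can_fetch("*", "https://example.com/products"))    # True
```

This is a quick way to confirm that a new rule blocks exactly the paths you intend and nothing more.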

Brijesh Patel
