The following steps show you how to configure robots.txt in Magento 2:
- On the Admin sidebar, select Stores. Under Settings, choose Configuration.
- Expand the Search Engine Robots section.
- When complete, choose Save Config.
As you know, the Magento configuration contains settings to create and manage instructions for the web crawlers and bots that index your site. These instructions are saved in a file called “robots.txt” that resides in the root of your Magento installation. The instructions are directives that are recognized and followed by most search engines. By default, the robots.txt file that Magento produces contains instructions for web crawlers to avoid indexing certain parts of the site that contain files used internally by the system. You can use the default settings or define your own custom instructions for all, or for specific, search engines.
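As a hedged illustration of how these instructions are followed, Python's standard `urllib.robotparser` applies the same rules a well-behaved crawler does (the paths below are examples, not Magento's actual defaults):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (illustrative paths, not Magento's exact defaults)
rules = [
    "User-agent: *",
    "Disallow: /checkout/",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler skips disallowed paths...
print(rp.can_fetch("*", "https://example.com/checkout/cart"))       # False
# ...but is free to index everything else.
print(rp.can_fetch("*", "https://example.com/some-product.html"))   # True
```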
Here we will discuss how to configure robots.txt and set up some custom instructions.
- On the Admin sidebar, choose Stores.
- Under Settings, select Configuration. Then, in the panel on the left under General, select Design.
- Expand the Search Engine Robots section.
Set Default Robots to one of the following:
- INDEX, FOLLOW instructs web crawlers to index the site and to check back later for changes.
- NOINDEX, FOLLOW instructs web crawlers to avoid indexing the site, but to check back later for changes.
- INDEX, NOFOLLOW instructs web crawlers to index the site once, but not to check back later for changes.
- NOINDEX, NOFOLLOW instructs web crawlers to avoid indexing the site and not to check back later for changes.
If needed, click Reset to Default to restore the default instructions.
- When complete, choose Save Config.
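The Default Robots setting is rendered as a robots meta tag in the head of each page. With NOINDEX, NOFOLLOW selected, the output looks something like this:

```html
<meta name="robots" content="NOINDEX,NOFOLLOW"/>
```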
Here are some examples of custom instructions:
- Allow full access
- Disallow access to all folders
- Default instructions
Allow Full Access
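In robots.txt, full access is granted with an empty Disallow directive:

```
User-agent: *
Disallow:
```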
Disallow Access to All Folders
- User-agent: *
- Disallow: /
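The default instructions differ between Magento versions, so check the Search Engine Robots section of your own installation for the actual list. As an assumed illustrative excerpt, they block folders that are used internally by the system:

```
User-agent: *
Disallow: /app/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
```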
The steps above help you learn how to configure robots.txt in Magento 2.
Last Update: April 9, 2018