The robots.txt Function on Blogger

The robots.txt Function on Blogger - The robots.txt file on Blogger is important because it helps search engines crawl and index your blog in SE (Search Engines).
robots.txt is simply a convention that governs search engine robots (web crawlers); it can also be used to keep certain pages from being crawled.


Suppose you do not want certain pages to appear in search engines. With robots.txt we can take advantage of its rules to block crawlers from those pages.
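As a sketch, here is what such a rule could look like. The page path /p/private-page.html is made up for illustration; substitute the path of the page you actually want to block. Note that Disallow only stops crawling, which usually (but not always) keeps the page out of the index:

```
User-agent: *
Disallow: /p/private-page.html
Allow: /
```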

Every Blogspot blog already comes with a robots.txt file provided by Blogger. By default it looks like this:

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://arbloggerlab.blogspot.com/sitemap.xml

You may be wondering what the code above actually means, so I will explain it line by line.

Information
User-agent: Mediapartners-Google — targets Google's AdSense crawler (Mediapartners-Google).
Disallow: (left empty) — nothing is blocked for that crawler, so it can crawl every page.
User-agent: * — the rules that follow apply to all search engine robots.
Disallow: /search — robots are not allowed to crawl the /search paths (Blogger's label and archive pages).
Allow: / — all other pages may be crawled, except those already disallowed.
Sitemap: https://arbloggerlab.blogspot.com/sitemap.xml — the address of your blog's sitemap feed.
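To see how these default rules behave in practice, here is a small sketch using Python's standard urllib.robotparser module. The post and label URLs are only examples on this blog's domain:

```python
import urllib.robotparser

# Blogger's default robots.txt rules, as shown above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /search pages (label and archive listings) are blocked for general crawlers
print(rp.can_fetch("*", "https://arbloggerlab.blogspot.com/search/label/SEO"))   # False

# Individual post pages remain crawlable
print(rp.can_fetch("*", "https://arbloggerlab.blogspot.com/2020/01/post.html"))  # True

# The AdSense crawler (Mediapartners-Google) may fetch everything
print(rp.can_fetch("Mediapartners-Google",
                   "https://arbloggerlab.blogspot.com/search/label/SEO"))        # True
```

This is why label and archive pages do not show up in search results, while your individual posts still do.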

That is a brief explanation of the function of robots.txt. I hope this article is useful.