Crawlers And Indexing Settings In Blogger | Fact On Web

If you want to configure the search engine settings of your blog correctly, this post walks you through a simple, reliable way to do it.

1. First, enable Custom robots.txt (found under Settings → Crawlers and indexing).

2. Fill in the Custom robots.txt field with the rules shown in the example below:

User-agent: *
Disallow: /search
Allow: /
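In plain terms, these rules tell every crawler (User-agent: *) to skip the /search pages (Blogger's label and search-result pages, which would otherwise create duplicate content) while leaving everything else crawlable. As a quick sanity check, you can parse the same rules locally with Python's standard urllib.robotparser; the blog URL below is just a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the Custom robots.txt example above.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "example" is a placeholder blog name, not a real site.
blocked = parser.can_fetch("*", "https://example.blogspot.com/search/label/news")
allowed = parser.can_fetch("*", "https://example.blogspot.com/2024/05/my-post.html")

print(blocked)  # False: /search pages are disallowed
print(allowed)  # True: regular posts remain crawlable
```

Running this confirms that label/search pages are blocked while ordinary post URLs stay open to crawlers.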

3. Enabling Custom robots header tags is also very important, as explained below.

Before proceeding with these Blogger settings, you should understand what each of the available header tags means:

  • all: there are no restrictions; crawlers may crawl and index the page.
  • noindex: the page can still be crawled, but it will not be indexed, so it won't appear in search results.
  • nofollow: blog posts contain both internal and external links. Crawlers normally follow all the links on a page; enable this option if you don't want the links on a particular page to be followed.
  • none: shorthand for both "noindex" and "nofollow" at once.
  • noarchive: prevents search engines from showing a cached copy of the page.
  • nosnippet: prevents search engines from showing a snippet of the page in search results.
  • noodp: prevents search engines from using titles and descriptions from the Open Directory Project (DMOZ).
  • notranslate: prevents search engines from offering a translation of the page in search results.
  • noimageindex: the page itself is crawled and indexed, but the images on it are not.
  • unavailable_after: removes the page from the index after a specified date and time.
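Each of these values ultimately appears as a robots meta tag in a page's HTML head. For example, a page tagged with "none" (i.e. noindex plus nofollow combined) carries markup like this:

```html
<meta name="robots" content="none">
```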

4. Custom robots tags for the home page: all, noodp

5. Custom robots tags for archive and search pages: noindex, noodp

6. Custom robots tags for posts and pages: all, noodp
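With those values saved, each page type ends up with a matching robots meta tag in its rendered HTML. As an illustrative sketch (the exact markup Blogger generates may differ slightly), the page source looks roughly like:

```html
<!-- Home page, posts and pages (all, noodp): -->
<meta name="robots" content="all, noodp">

<!-- Archive and search pages (noindex, noodp): -->
<meta name="robots" content="noindex, noodp">
```

You can verify the result yourself by opening "View page source" in your browser on each page type and searching for "robots".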