How To Tell Search Engines To Use My Updated Robots.txt File?

- 1 answer

Previously, I used a robots.txt file to block search engine robots from crawling my website, but now I want to unblock them.

I updated the robots.txt file to allow search engine robots to crawl my website, but it seems the search engines are still using my old robots.txt file. How can I tell the search engines to use my new robots.txt file? Or is there something wrong with my robots.txt file?

The content of my old robots.txt file:

User-agent: *
Disallow: /

The content of my new robots.txt file:

User-agent: *
Allow: /

# Disallow these directories, url types & file-types
Disallow: /trackback/
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /wp-
Disallow: /cgi-bin
Disallow: /readme.html
Disallow: /license.txt
Disallow: /*?*
Disallow: /*.php$
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*/wp-*
Disallow: /*/feed/*
Disallow: /*/*?s=*
Disallow: /*/*.js$
Disallow: /*/*.inc$

Allow: /wp-content/uploads/

User-agent: ia_archiver*
Disallow: /

User-agent: duggmirror
Disallow: /

Sitemap: https://example.com/sitemap.xml
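As a quick local sanity check of the new rules, you can feed them to Python's standard-library `urllib.robotparser`. Note its limitations: it applies the first matching rule (unlike Google's longest-match semantics) and does not understand wildcard patterns such as `/*?*`, so the sketch below uses an abbreviated, reordered subset of the rules with the specific `Disallow` lines listed before the blanket `Allow`.

```python
from urllib.robotparser import RobotFileParser

# Abbreviated subset of the updated robots.txt. Python's parser is
# first-match, so more specific Disallow rules come before Allow: /.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Under the old file (Disallow: /) the site root was blocked;
# with the new rules it is crawlable again, while the admin
# area stays disallowed.
print(rp.can_fetch("*", "/"))                # True
print(rp.can_fetch("*", "/wp-admin/x.php"))  # False
```

For the wildcard and `$`-anchored rules (e.g. `Disallow: /*.php$`), test against a parser that implements Google's matching, since the stdlib module will treat those paths literally.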

Answer

This needs to be done independently for each search engine; otherwise, the change will simply be picked up over time as crawlers re-fetch your robots.txt. For Google, use Google Search Console, which lets you submit the updated robots.txt and request a recrawl.

source: stackoverflow.com