
I Have Disallowed Everything For 10 Days


Due to an update error, I pushed to production a robots.txt file that was intended for a test server. As a result, production ended up with this robots.txt:

User-Agent: *
Disallow: /

That was 10 days ago, and I now have more than 7,000 URLs flagged with either an Error (Submitted URL blocked by robots.txt) or a Warning (Indexed, though blocked by robots.txt).

Yesterday, of course, I corrected the robots.txt file.
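For reference, a minimal corrected robots.txt that lets all crawlers access everything again (assuming nothing on the site actually needs to stay blocked) looks like this:

User-Agent: *
Disallow: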

What can I do to speed up the correction by Google or any other search engine?


Answer

You could use the robots.txt Tester in Google Search Console: https://www.google.com/webmasters/tools/robots-testing-tool

Once the robots.txt test has passed, click the "Submit" button; a popup window should appear. Then click the "Submit" button for option #3:

"Ask Google to update: Submit a request to let Google know your robots.txt file has been updated."

Other than that, I think you'll have to wait for Googlebot to crawl the site again.
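Before (or after) submitting, it may also be worth confirming that the fixed file is really what production serves, since caches or deploy pipelines can lag. Here is a minimal sketch using Python's standard urllib.robotparser; the example.com domain and sample paths are placeholders for your own site:

```python
import urllib.robotparser

# Point the parser at the live robots.txt
# (replace example.com with your production domain).
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()

# Check a few representative URLs against Googlebot's user agent.
for path in ["/", "/some-page", "/blog/article"]:
    url = "https://example.com" + path
    allowed = parser.can_fetch("Googlebot", url)
    print(path, "allowed" if allowed else "blocked")
```

If every path prints "allowed", the corrected file is live and it is just a matter of waiting for Google to recrawl and reprocess the blocked URLs.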

Best of luck :).

source: stackoverflow.com