
Not Understanding This Robots.txt


Another company has set up the robots.txt for a site I manage. This is the code they used:

User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*

I see Disallow: /products/ and Allow: /products/. I don't understand why they wrote it this way. Should I change anything?


Answer

  1. The first group, with the directive Allow: /products/, applies to the Google and Bing bots. A crawler obeys only the single group that best matches its user agent, so these bots ignore the `User-agent: *` group entirely; since nothing is disallowed for them, they may crawl everything.
  2. The second group, with Disallow: /products/ and the other rules, applies to every other bot, which matches `User-agent: *`.

I don't see much sense in these rules, but they don't violate the standard. You can check the effective behavior yourself with a parser, as in the sketch below.
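
For instance, Python's standard-library urllib.robotparser can parse the file and answer per-bot queries. A minimal sketch (SomeOtherBot is a made-up crawler name for illustration):

from urllib import robotparser

# The robots.txt from the question, verbatim.
ROBOTS_TXT = """\
User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches the first group, so only Allow: /products/ applies to it.
# Nothing is disallowed for it, so every path is crawlable.
print(rp.can_fetch("googlebot", "/products/"))     # True
print(rp.can_fetch("googlebot", "/wp-admin/"))     # True - the * group is ignored

# Any other bot falls through to the User-agent: * group.
print(rp.can_fetch("SomeOtherBot", "/products/"))  # False
print(rp.can_fetch("SomeOtherBot", "/sales/"))     # False

Note that urllib.robotparser applies rules in file order (first match wins) rather than Google's longest-match precedence, so results can differ for overlapping paths such as /wp-admin/admin-ajax.php; the group-selection behavior shown above is the same either way.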

source: stackoverflow.com