Not Understanding This Robots.txt
Another company has set up the robots.txt for a site I manage. This is the code they used:
```
User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*
```
I see both `Disallow: /products/` and `Allow: /products/`. I don't understand why they wrote it like this. Should I change anything?
Answer
- The first directive, `Allow: /products/`, applies to the Google and Bing bots.
- The second directive, `Disallow: /products/`, applies to all other bots (`User-agent: *`).
Keep in mind that a crawler obeys only the single group that best matches its user agent. Googlebot and Bingbot therefore follow nothing but `Allow: /products/` and ignore the `User-agent: *` group entirely, so even /wp-admin/ and /sales/ remain crawlable for them. I don't see much sense in these rules, but they don't violate the standard.
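If you want to double-check how the groups are interpreted, here is a minimal sketch using Python's standard-library `urllib.robotparser` (the paths and the `SomeOtherBot` name are just examples; note that `robotparser` uses first-match precedence within a group rather than Google's longest-match rule, though for these rules the outcome is the same):

```python
from urllib import robotparser

# The robots.txt from the question, verbatim; a blank line is added between
# the two groups for readability, but robotparser starts a new group at a
# User-agent line either way.
ROBOTS_TXT = """\
User-agent: googlebot
User-agent: google
User-agent: bingbot
User-agent: bing
Allow: /products/

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /sales/
Disallow: /products/
Allow: /wp-content/uploads/
Allow: /wp-content/themes/
Allow: /wp-admin/admin-ajax.php*
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches the first group, so only "Allow: /products/" applies
# to it; every path not mentioned in that group is allowed by default.
print(rp.can_fetch("Googlebot", "/products/some-page"))  # True
print(rp.can_fetch("Googlebot", "/wp-admin/"))           # True (the * group is ignored)

# Any other bot falls through to the "User-agent: *" group.
print(rp.can_fetch("SomeOtherBot", "/products/some-page"))  # False
print(rp.can_fetch("SomeOtherBot", "/wp-admin/"))           # False
```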