
Robots.txt With Multiple Domain Sitemap Entries

Our website has many domain names like:

example.co.uk
example.in
example.co.eg
...

The robots.txt file should contain an entry that shows where our sitemap file is located.

So my question is:

We have a separate sitemap file for each domain, so how can we put these entries, one per domain, into a single robots.txt file?


Answer

Assuming you have the same robots rules for all domains, place a single robots.txt file in the root and put all the sitemap links into it. The sitemap links would look like:

http://www.example.co.uk/sitemap.xml
http://www.example.in/sitemap.xml
http://www.example.co.eg/sitemap.xml
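
In robots.txt each of these goes on its own Sitemap: line. A minimal sketch, assuming you want to allow all crawling, would be:

User-agent: *
Disallow:

Sitemap: http://www.example.co.uk/sitemap.xml
Sitemap: http://www.example.in/sitemap.xml
Sitemap: http://www.example.co.eg/sitemap.xml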

Or you can set up a structure like:

/robots/co-uk.txt
/robots/in.txt
/robots/co-eg.txt

put the matching sitemap link into each one, and deliver them dynamically based on the visitor's country with a rewrite rule like:

<IfModule mod_geoip.c>
    GeoIPEnable On
    RewriteEngine On
    # GeoIP returns ISO 3166-1 country codes, so the United Kingdom is "GB", not "UK"
    RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^GB$
    RewriteRule ^robots\.txt$ robots/co-uk.txt [L]
</IfModule>
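
Each of those files then contains only its own domain's sitemap link; robots/co-uk.txt, for example, could be just:

User-agent: *
Disallow:

Sitemap: http://www.example.co.uk/sitemap.xml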

For this you should have mod_geoip or IP2Location installed. If you use IP2Location, replace GEOIP_COUNTRY_CODE with IP2LOCATION_COUNTRY_SHORT.
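
With IP2Location the rewrite part would then look roughly like this (only the mod_rewrite directives are shown; enabling the module itself depends on how it is installed):

RewriteEngine On
RewriteCond %{ENV:IP2LOCATION_COUNTRY_SHORT} ^GB$
RewriteRule ^robots\.txt$ robots/co-uk.txt [L]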

source: stackoverflow.com