How To Block Subdomain Used For URL Shortening Service With Robots.txt?
Let’s say my domain is www.example.com. I have set up my main website there (with Blogger) and use go.example.com for URL shortening (set up with GoDaddy’s URL shortening service). Now I want to block all go.example.com URLs so that they can’t be indexed.
If I use go.example.com URLs in my main blog, does this affect my blog’s search engine optimization?
With this robots.txt on go.example.com, you disallow conforming bots from crawling any URL on that host:
# https://go.example.com/
User-agent: *
Disallow: /
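As a quick sanity check, Python’s standard-library urllib.robotparser can simulate how a conforming crawler would interpret these rules (the Googlebot user-agent name and the example short URL below are only illustrations, not part of the question):

```python
from urllib import robotparser

# The same rules as the robots.txt above, fed to the parser directly.
rules = """User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A conforming bot may not fetch any URL on this host.
allowed = rp.can_fetch("Googlebot", "https://go.example.com/abc123")
print(allowed)  # → False
```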
If you are fine with crawling but want to prevent indexing (you can’t have both), you have to use noindex, either in the meta robots element or in the X-Robots-Tag HTTP header. Note that you can’t use noindex as a link type in the …
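For illustration, here is a minimal sketch of serving a page with both signals, the X-Robots-Tag HTTP header and a meta robots element, using Python’s standard-library http.server (the handler name, page body, and request path are made up for the example):

```python
import http.server
import threading
import urllib.request

class NoindexHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # noindex in the HTML itself (meta robots element)...
        body = b'<!doctype html><meta name="robots" content="noindex">short-link page'
        self.send_response(200)
        # ...and noindex in the HTTP response header (X-Robots-Tag).
        self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Serve on an ephemeral local port and fetch one page to inspect the header.
server = http.server.HTTPServer(("127.0.0.1", 0), NoindexHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/abc123") as resp:
    tag = resp.headers.get("X-Robots-Tag")
    html = resp.read().decode()

server.shutdown()
print(tag)  # → noindex
```

Either signal alone is enough for bots that support it; the header is useful for non-HTML responses, where no meta element exists.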
That said, if you always redirect (e.g., with a 301) from go.example.com to the canonical URL (on the other host), search engine bots have no reason to index the document on go.example.com, as you don’t provide any content there, only the redirect.
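A minimal sketch of that redirect behavior, again with Python’s standard-library http.server (the canonical host www.example.com and the short path /abc123 are assumptions taken from the question, not real URLs):

```python
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Permanently redirect every short URL to the canonical host,
        # preserving the path; no body is served on the short-link host.
        self.send_response(301)
        self.send_header("Location", "https://www.example.com" + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Use http.client so the redirect is observed rather than followed.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/abc123")
resp = conn.getresponse()
status = resp.status
location = resp.getheader("Location")
conn.close()
server.shutdown()

print(status, location)  # → 301 https://www.example.com/abc123
```

Because every response is a 301 with no content, bots consolidate signals onto the redirect target, so the short URLs themselves have nothing to index.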