
Robots.txt Disallow Path With Regular Expression


Does robots.txt accept regular expressions? I have many URLs in this format:

https://example.com/view/99/title-sample-text
ID ----------------------^
Title -----------------------------^

I used this:

Disallow: /view

But it looks like this is not working, because Google has indexed more pages, so I want to do this with a regex, something like this:

Disallow: /view/([0-9]+)/([^/]*)

But is this format correct, or even valid, in robots.txt?


Answer

robots.txt does not support regular expressions, but you can use a wildcard, which Googlebot and most major crawlers understand:

User-agent: *
Disallow: /view/*

See https://webmasters.stackexchange.com/questions/72722/can-we-use-regex-in-robots-txt-file-to-block-urls
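If you want to sanity-check a rule locally, here is a minimal sketch using Python's standard-library urllib.robotparser. Note that this parser only implements plain prefix matching, not Google's * wildcard, so the rule is written as a bare prefix here, which blocks the same paths:

import urllib.robotparser

# Hypothetical rules matching the answer's intent; Python's built-in parser
# only does plain prefix matching, so the trailing * is left off.
rules = [
    "User-agent: *",
    "Disallow: /view/",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# The example URL from the question should be reported as blocked.
url = "https://example.com/view/99/title-sample-text"
print(parser.can_fetch("*", url))  # prints: False

The trailing * in the rule above is harmless for Google; robots.txt matching is prefix-based, so Disallow: /view/ on its own already covers these URLs.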

Hope this helps.

source: stackoverflow.com