
Block 100s of URLs from Search Engines Using robots.txt

I have about 100 pages on my website which I don't want indexed by Google. Is there any way to block them using robots.txt? It would be very tiresome to edit each page and add a noindex meta tag.

All the URLs I want to block look like this:

www.example.com/index-01.html

www.example.com/index-02.html

www.example.com/index-03.html

www.example.com/index-04.html

...

www.example.com/index-100.html

I'm not sure, but would adding something like the following work?

User-Agent: *
Disallow: /index-*.html

Answer

Yes, it will work: the major crawlers, including Googlebot and Bingbot, support the * wildcard in robots.txt Disallow rules, so Disallow: /index-*.html will block all of those URLs.

Ref : "https://geoffkenyon.com/how-to-use-wildcards-robots-txt"  
source: stackoverflow.com