
What rules should I add to robots.txt to block URLs that have special characters?

by @rchalana (122), 1 year ago

I was going through the page issues in Google Search Console, and one of the issue reasons is "Alternate page with proper canonical tag". It lists some URLs that look like the ones below:

example.com/abc/?hstc_sdb.sdhjb16573.328y... and so on
example.com/abc/?utm_term... and so on
example.com/abc/?job... and so on
example.com/abc/?amp... and so on
example.com/abc/?hs... and so on

There are many more like this. However, I am not sure where these are coming from, so I thought of adding rules to robots.txt to disallow some of these parameters and prevent these kinds of pages from being crawled by search engines. (My first question: is this the right and good approach?)

My second question is: what exactly should be added to robots.txt for the above-mentioned URLs? Will the rules below solve the purpose?

Disallow: /?amp
Disallow: /?hs
Disallow: /?__hstc
Disallow: /?job
Disallow: /?utm

If not, what are the right rules?
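
For reference, I have also seen wildcard-style rules used for cases like this, so here is a rough sketch of what I think the full block might look like. This is just my guess, assuming the parameters can appear after any path (not only the site root) and that the crawler supports * wildcards the way Google does; please correct it if it is wrong:

User-agent: *
# Assumed wildcard patterns: "/*?*" is meant to match the parameter
# anywhere in the query string, on any path, not just example.com/?...
Disallow: /*?*__hstc
Disallow: /*?*utm_
Disallow: /*?*job
Disallow: /*?*amp
Disallow: /*?*hs

I am also a bit worried that short patterns like "hs" or "job" could accidentally match other parameters or pages, so any advice on how specific these patterns need to be would help.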

It's a WordPress website.

