
Google’s John Mueller said on Reddit that disallowing URLs with UTM parameters in them will not help you improve crawling or ranking with Google Search. He added that a site should try to keep its internal URLs clean and consistent, but over time, canonical tags should help with external links that carry UTM parameters on them.
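For illustration only (this snippet is not from Mueller's post, and example.com is a made-up domain), the cleanup he describes relies on each page declaring its clean canonical URL, so that a visit via a UTM-tagged address such as example.com/page/?utm_source=newsletter still points Google back to the clean version:

    <link rel="canonical" href="https://www.example.com/page/">

With that tag in place, Google can consolidate the parameterized URLs onto the canonical URL over time without any robots.txt rules.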

John wrote, “I doubt you’d see any visible effects in crawling or ranking from this. (And if there’s no value from doing it, why do it?)” when he was asked about disallowing such URLs.

He added:

Generally speaking, I’d still try to improve the site so that irrelevant URLs don’t need to be crawled (internal linking, rel-canonical, being consistent with URLs in feeds). I think that makes sense in terms of having things cleaner & easier to track – it’s good site-hygiene. If you have random parameter URLs from external links, those would get cleaned up with rel-canonical over time anyway, I wouldn’t block those with robots.txt. If you’re generating random parameter URLs yourself, say within the internal linking, or from feed submissions, that’s something I’d clean up at the source, rather than blocking it with robots.txt.

tldr: clean site? yes. block random crufty URLs from external? no.
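For context, blocking UTM-parameter URLs in robots.txt (the approach Mueller advises against) would typically use wildcard Disallow rules along these lines; this is a hypothetical example, not something quoted in the thread:

    User-agent: *
    Disallow: /*?utm_
    Disallow: /*&utm_

Note that URLs blocked in robots.txt are not crawled at all, so Google cannot see a rel-canonical on those pages, which is why the cleanup via canonical tags requires them to stay crawlable.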

This is all similar to previous advice from John Mueller that I quoted in these stories:

Forum discussion at Reddit.


