Google’s John Mueller said (again) that the Google URL Parameters tool is scary. He said this on Reddit, suggesting the person use noindex or robots.txt methods as an alternative, because Google’s testing tools will show you the effect of noindex or robots.txt, but not of the URL Parameters tool.
The question on Reddit was: “Is the next logical step using the URL parameter tool or noindex when Google keeps indexing parameter URLs, such as ‘search?’ and ‘sort?’, even though an XML sitemap and canonicals are in place?”
John replied “noindex or robots.txt. The URL parameter handling tool is scary, I suspect a lot of people get it wrong and shoot themselves in the foot. With robots.txt or noindex (both together don’t make sense) at least you see the effect in testing tools.”
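To illustrate the two options John mentions (using hypothetical parameter URLs like /search?q=… and /products?sort=… as an assumption; adjust the patterns to your own site), a robots.txt rule might look like this:

```
# robots.txt — block crawling of parameter URLs (hypothetical patterns)
User-agent: *
Disallow: /search?
Disallow: /*?sort=
```

The noindex alternative is a meta tag on the parameter pages themselves:

```
<!-- Allows crawling, but asks search engines not to index the page -->
<meta name="robots" content="noindex">
```

As John notes, combining the two doesn’t make sense: if robots.txt blocks the URL, Google can’t crawl the page to see the noindex tag.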
There are Googlers, and some SEOs as well, who would prefer to remove the tool altogether, because when you use it, Google will listen to it. We have been expecting a version 2.0 of this tool that is supposed to be “really cool,” but that promise was made almost two years ago…
The tool is not in the new (not sure why I am still calling it new) version of Google Search Console; you can still access it over here, but be careful. Google also has a help document on the tool.
The reason I mostly discourage the URL parameter tool is that it’s out of sight when working on the site. We regularly run into cases where you set something there, forget it, and then wonder why it’s not working. No external tool can notice/flag it. Too much magic, imo.
— 🐐 John 🐐 (@JohnMu) March 14, 2022
Forum discussion at Twitter.