Jun 20
Google’s John Mueller said on Twitter that the URL Parameters tool is not a replacement for a robots.txt file when it comes to blocking content from being crawled. John was asked how reliable it is to set "Crawl" to "No URLs" for a certain type of URL pattern. John replied, “it’s not a replacement for the robots.txt — if you need to be sure that something’s not crawled, then block it properly.”
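For example, a site that needs to be certain URLs with a given query parameter are never crawled could block that pattern directly in robots.txt rather than rely on the URL Parameters tool. A minimal sketch, assuming a hypothetical "sessionid" parameter:

    User-agent: *
    # Block any URL whose query string contains the sessionid parameter
    Disallow: /*?*sessionid=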
See original here:
Google: URL Parameters Tool Is Not A Replacement For Robots.txt