Google: Do Not Use Robots.txt To Block Indexing Of URLs With Parameters

Google’s John Mueller said you should absolutely not “use robots.txt to block indexing of URLs with parameters.” If you do, he explained, Google “cannot canonicalize the URLs, and you lose all of the value from links to those pages.” Instead, use rel=canonical and link consistently throughout your site.
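As a rough illustration (the URLs and rules below are hypothetical, not taken from Mueller's comments), a parameterized page can point search engines at its clean equivalent with a rel=canonical link element in its <head>:

    <!-- on https://example.com/shoes?sort=price -->
    <link rel="canonical" href="https://example.com/shoes">

This keeps the parameterized URL crawlable, so Google can see the canonical hint and consolidate link signals onto the clean URL. A robots.txt rule like the following, by contrast, stops Google from crawling those pages at all:

    Disallow: /*?

Because the blocked pages are never fetched, Google never sees the canonical, and any links pointing at the parameterized URLs pass no value to the rest of the site.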

Read the original here:
Google: Do Not Use Robots.txt To Block Indexing Of URLs With Parameters
