Google: Robots.txt Files Must Be Smaller Than 500KB

Google’s John Mueller reminds webmasters on his Google+ page that Google can only process up to 500KB of your robots.txt file. This is an important point if you have a super heavy robots.txt file that runs beyond 500KB…
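If you want a quick way to see how close your own file is to that cap, a simple size check does the job. Below is a minimal Python sketch; the example.com URL is a placeholder, and the 500 × 1024-byte reading of "500KB" is an assumption rather than Google's exact documented byte count.

```python
import urllib.request

# Assumed interpretation of the "500KB" limit mentioned in the post;
# Google's exact byte cutoff may differ.
LIMIT_BYTES = 500 * 1024


def robots_txt_size(site: str) -> int:
    """Fetch a site's robots.txt and return its size in bytes (hypothetical helper)."""
    url = f"{site.rstrip('/')}/robots.txt"
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())


if __name__ == "__main__":
    # Placeholder domain; swap in your own site.
    size = robots_txt_size("https://www.example.com")
    status = "within" if size <= LIMIT_BYTES else "over"
    print(f"robots.txt is {size} bytes ({status} the 500KB limit)")
```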

Read the rest here:
Google: Robots.txt Files Must Be Smaller Than 500KB
