Google: Don’t Dynamically Update Robots.txt File Multiple Times Per Day

Google’s John Mueller said that since Google caches the robots.txt file for about 24 hours, it does not make much sense to dynamically update your robots.txt file throughout the day to control crawling. Google won’t necessarily see that you don’t want a page crawled at 7am but do want it crawled at 9am.
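To see why this matters, here is a minimal sketch using Python’s standard-library `urllib.robotparser`. The hostname and paths are hypothetical. The point: whatever rules a crawler fetched and cached (as Google does for roughly a day) are the rules it applies for the whole cache window, no matter how many times the live file changes in between.

```python
from urllib import robotparser

# Simulate the robots.txt Google fetched (and cached) at 7am.
cached_7am = robotparser.RobotFileParser()
cached_7am.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Even if the site owner edits robots.txt at 9am to allow /private/,
# a crawler still working from the 7am cache keeps applying the old rules.
print(cached_7am.can_fetch("*", "https://example.com/private/page"))  # False
print(cached_7am.can_fetch("*", "https://example.com/public/page"))   # True
```

In other words, a rule should reflect what you want crawled over the whole day, not a schedule of intraday toggles.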

