Today marks the 20th anniversary of robots.txt, the file that lets webmasters block search engines from crawling their pages. Martijn Koster created robots.txt in 1994, while he was working at Nexor, after crawlers hit his server too hard.
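To illustrate what a robots.txt file actually does, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and the `example.com` URLs are hypothetical, chosen only to show a crawler checking whether a path is blocked:

```python
from urllib import robotparser

# Hypothetical robots.txt rules: block every crawler from /private/
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)  # a real crawler would fetch https://example.com/robots.txt

# A well-behaved crawler asks before fetching each URL
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Compliance is voluntary: robots.txt only works because well-behaved crawlers choose to check it before fetching a page.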