# robots.txt
#
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Prevent Slurp from recursing on SSI files
# http://www.inktomi.com/slurp.html
User-agent: Slurp
Disallow: /~mikaelb/

User-agent: Googlebot
Disallow: /~kenyata/kristofferp.com/

# Prevent all other crawlers from recursing on certain popular SSI files
User-agent: *
Disallow: /cgi-bin/j-e/
Disallow: /ssh
# MediaWiki, excluded since it is behind a login wall
Disallow: /wiki
Disallow: /~mikaelb/
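
# Note on group matching: a crawler obeys only the single group whose
# User-agent line most specifically matches it, and falls back to the
# "*" group only when no specific group matches. Strict parsers honor
# just one group per user agent, so all rules for unnamed crawlers are
# kept in the single "*" group above, and any path that Slurp or
# Googlebot must also skip has to be repeated inside their own groups
# (as /~mikaelb/ is for Slurp).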