#
# Defines which web crawlers, spiders, etc. get access, and to where.
# Also stops the annoying "robots.txt not found" error message.
#
# No restrictions for any of them:
User-agent: *
Disallow: