A robots.txt file is the gatekeeper for your website or blog: it tells search engine spiders which parts of your site they may or may not crawl. If you don't have a robots.txt file, spiders and bots have to make assumptions about how they should access your site. It's better to send clear instructions about what you want spiders and bots to do.
Assuming you want maximum traffic to your website, you'll want to allow all spiders and bots to crawl and index everything on it. The names of spiders and bots change over time, so it's a good idea to allow them all with a single wildcard rule. Some guides list every spider and bot they know of, line by line, but we just want to cover all of them without having to keep modifying the robots.txt file in the future.
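A minimal robots.txt that covers every spider and bot looks like this (place it in the root of your site, e.g. at /robots.txt):

```
User-agent: *
Disallow:
```

The `User-agent: *` line matches all crawlers, and the empty `Disallow:` directive means nothing is off-limits, so every spider is allowed to crawl the whole site. Be careful not to write `Disallow: /` by mistake, which means the opposite: it blocks crawlers from your entire site.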