The importance of a robots.txt file
All websites should have a robots.txt file. This is a plain text file that search engines read before crawling the content on your site. It tells their crawlers which areas of the site they are, and are not, allowed to crawl and therefore index.
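As a rough sketch, a minimal robots.txt might look like the following; the /admin/ path is purely illustrative, not something every site needs to block:

    User-agent: *
    Disallow: /admin/

The first line says the rule applies to all crawlers; the second tells them not to crawl anything under /admin/. Everything not disallowed is crawlable by default.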
A robots.txt file can be useful in a number of ways. If you are building a website that is not password protected, add a rule to the robots.txt file disallowing all crawlers, so that search engines do not crawl and list your unfinished site. If the site is indexed before the pages are complete, visitors may land on an unfinished site and leave (possibly never to return), and the search engines may list placeholder text as your content. It can also mean that when the site finally does go live, the search engines take longer to recrawl and update your content and improve your search ranking, because they previously judged the site's content to be poor.
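A sketch of a robots.txt that asks all crawlers to stay out of the entire site during development:

    User-agent: *
    Disallow: /

Bear in mind this only stops well-behaved crawlers from crawling; for an extra layer of protection while building, password protection is the more reliable option.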
Once your site is live, always remember to update this file; otherwise you may find yourself a month down the line with no listings whatsoever, and therefore very few visitors.
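Once live, the file might be updated along these lines; the /admin/ path and the sitemap URL are placeholders for your own:

    User-agent: *
    Disallow: /admin/

    Sitemap: https://www.example.com/sitemap.xml

This reopens the site to crawlers, keeps any private areas blocked, and points the search engines at your sitemap so they can find and index your pages more quickly.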