How to Keep Search Engines from Indexing Your Site
If you have a development site or page that you do not want search engines to index, you can indicate that with a robots.txt file. A robots.txt file tells crawlers which parts of a site they should not visit; the example below blocks every page on the site.
Blocking All Robots from the Entire Site
1. Create a text file named robots.txt
2. Type the following into the file:
User-agent: *
Disallow: /
3. Save the text file and place it in the root of your site files (the httpdocs folder). See How to Use FTP.
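The two lines above block every crawler from every page. If you only need to keep crawlers out of part of the site, such as a development folder, you can disallow just that path instead. The sketch below assumes a hypothetical /dev/ directory; adapt the path to your own folder names.

# Hypothetical example: block only the /dev/ directory and leave the rest of the site crawlable
User-agent: *
Disallow: /dev/

An empty Disallow: line (or no robots.txt file at all) allows everything, so list only the paths you genuinely want hidden.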
This file is honored by legitimate search engines but may be ignored by crawlers designed with malicious intent. For more information on robots.txt files, visit http://www.robotstxt.org/robotstxt.html.
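To confirm the file behaves the way you expect, you can test it with Python's built-in urllib.robotparser module. This is only a quick sketch; the example.com URLs are placeholders for your own domain.

import urllib.robotparser  # part of the Python standard library

# Point the parser at your own robots.txt (example.com is a placeholder)
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")
rp.read()  # fetches and parses the file

# With "Disallow: /" in place, this should print False for any page on the site
print(rp.can_fetch("*", "http://example.com/any-page.html"))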