Saturday, June 12, 2010

How to create robots.txt and .htaccess files.

Use of the .htaccess file

The .htaccess file can be used on Apache servers running Linux or Unix to increase your web site's security and customize the way your web site behaves. The main uses of .htaccess files are to:

- redirect visitors to custom error pages
- stop directory listings
- ban robots that harvest email addresses for spam
- ban visitors from certain countries or IP addresses
- block visitors arriving from bad referring (warez) sites
- protect your web site from image hotlinking and bandwidth theft
- redirect visitors from a requested page to a new web page
- password-protect directories

Use the information in this article as a starting point to optimize and protect your web site.
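As an illustration, a .htaccess fragment covering a few of these uses might look like the following. The directives shown are standard Apache ones (ErrorDocument, Options, Redirect, mod_rewrite); the file names, IP address, and domain are placeholders you would replace with your own:

```apache
# Serve a custom page for "404 Not Found" errors
ErrorDocument 404 /errors/404.html

# Stop directory listings
Options -Indexes

# Ban a specific IP address (example address shown)
Order Allow,Deny
Allow from all
Deny from 192.0.2.1

# Permanently redirect a requested page to a new one
Redirect 301 /old-page.html /new-page.html

# Block hotlinking of images from other sites
# (replace example.com with your own domain)
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]
```

The hotlinking rules allow requests with an empty referer (direct visits and some browsers) but return "403 Forbidden" for image requests referred from any other domain.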
More details in .htaccess


How to create a robots.txt file

The robots.txt file is a simple ASCII text file placed in your site's root directory that tells search engine crawlers which files and directories should not be indexed. Many webmasters avoid revising their robots.txt file because they are uncertain how the changes might affect their rankings. However, a poorly written (or missing) robots.txt file can cause your entire web site to be crawled and indexed, exposing content such as email addresses, hidden links, membership areas, and other files you never intended to appear in search results. A poorly written robots.txt file can equally cause portions of your web site to be ignored by the search engines. Keep in mind that robots.txt is only advisory: well-behaved crawlers honor it, but it is no substitute for real access control such as password protection.
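A minimal robots.txt might look like the following. The directory names and the bot name are examples only; substitute the paths and crawler names relevant to your site:

```txt
# Apply to all crawlers: keep private areas out of the index
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Disallow: /members/

# Block one specific crawler entirely (example bot name)
User-agent: BadBot
Disallow: /
```

An empty `Disallow:` line (or no Disallow rules at all) permits crawling of the whole site, while `Disallow: /` blocks everything for that user agent.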

More details in robots.txt