Robots.txt

A robots.txt file can be compared to a security guard: it tells search engines which parts of a website they may enter and which parts they are not allowed to crawl.
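
As a quick illustration, here is a minimal robots.txt sketch; the /private/ directory name is only a placeholder. It lets every crawler onto the site but asks them to stay out of one area:

    # Applies to every crawler
    User-agent: *
    # Ask crawlers not to enter this directory (placeholder path)
    Disallow: /private/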

Many people assume this has a negative effect on a website, reasoning that search engines need to go through every page for it to achieve a good ranking. That is largely true for most websites, but at the same time, many websites use a robots.txt file to keep sensitive areas out of the search engines' indexes.

These files are very useful. For example, if a website stores customers' personal details, it can use a robots.txt file to tell all the search engines not to crawl and index those pages. Put another way, if these files did not exist, search engines could index pages containing the sensitive information of a great many people.

It is also possible to combine a robots.txt file with password protection. This is considered good practice because the robots.txt file is publicly readable, so hackers can go straight to the files that are marked as off-limits to the search engine spiders; password protection keeps those private pages safe even when their locations are revealed.
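
Here is a hedged sketch of how the two can work together, assuming an Apache server and a placeholder /customer-accounts/ directory: the robots.txt entry keeps well-behaved crawlers out, while HTTP authentication in the directory's .htaccess file keeps everyone else out.

    # robots.txt – advisory only, readable by anyone
    User-agent: *
    Disallow: /customer-accounts/

    # .htaccess inside /customer-accounts/ – enforced by the server
    AuthType Basic
    AuthName "Private area"
    AuthUserFile /etc/apache2/.htpasswd   # placeholder path to the password file
    Require valid-user

With this setup, even a visitor who reads the Disallow line and requests the directory directly is met with a password prompt rather than the private pages.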