erosiaart opened this issue on Apr 10, 2008 · 23 posts
Rayraz posted Sat, 12 April 2008 at 6:02 PM
robots.txt files are used to control search engine indexing.
Basically you can tell bots where they may and may not crawl. robots.txt is a web standard, and most major search engines honor it.
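For example, a minimal robots.txt placed at the root of a site could look something like this (the directory names are just made up for illustration):

# Rules for all crawlers
User-agent: *
Disallow: /private/

# Extra rule just for Google's crawler
User-agent: Googlebot
Disallow: /drafts/

A compliant crawler requests /robots.txt first and skips anything the matching rules disallow.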
Malicious bots are built to be malicious, so they will probably ignore any standard that stands in their way. A malicious bot, like any other, is a piece of software. Software does only what it is built to do. If it's not built to use robots.txt files, it simply won't do anything with them.
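To show how voluntary the whole thing is, here's a rough sketch of the check a well-behaved crawler performs before fetching a page, using Python's standard urllib.robotparser module (the site and path are placeholders):

from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# A polite bot asks permission before fetching;
# a malicious bot simply never makes this call.
if rp.can_fetch("MyCrawler", "https://www.example.com/private/page.html"):
    print("allowed to fetch")
else:
    print("disallowed by robots.txt")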
Btw, I wouldn't recommend blocking your entire site from bots by means of a robots.txt, as it will just mean Google and the like won't index your site, which makes it much harder for people to find.
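For reference, blocking everything is just two lines, and that's exactly the setup I'd avoid unless you really don't want to be found:

User-agent: *
Disallow: /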
I don't think there is much you can do if the Chinese government decides to firewall your site. It's a corrupt government deciding entirely on its own terms what it does or does not like.
(_/)
(='.'=)
(")(")This is Bunny. Copy and paste bunny into your
signature to help him gain world domination.