Saturday, December 13, 2008

Robots.txt

If you want a search engine not to index a file or a folder on your website, you must create a file called robots.txt and place it in the root directory of your site. So, open Notepad or another plain-text editor and write in the following commands:

User-agent: *
Disallow: /admin/

In the example above we exclude the folder 'admin' from indexing by all search engines. User-agent identifies the search engine's crawler, and the * means 'all'. After Disallow you write the path of the folder or file to exclude from indexing.
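You can test these rules before publishing them. Python's standard library includes urllib.robotparser, which applies the same matching logic crawlers use; below is a minimal sketch, assuming the placeholder domain example.com, that feeds the two lines above to the parser and checks two URLs:

from urllib.robotparser import RobotFileParser

# The rules from the example above, given as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = RobotFileParser()
parser.parse(rules)

# can_fetch(user_agent, url) returns True if the agent may crawl the URL.
print(parser.can_fetch("*", "http://example.com/admin/users.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))        # True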

If you want to exclude the folder 'admin' from indexing by Google only, you must write the following commands into the file robots.txt:

User-agent: googlebot
Disallow: /admin/

If you want to exclude more than one folder, you must write:

User-agent: googlebot
Disallow: /admin/
Disallow: /log/
Disallow: /stats/

In this case we have excluded the folders 'admin', 'log', and 'stats' from Google's index.
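The parser can also confirm that these rules bind only Googlebot: a crawler whose name matches no User-agent group in the file is left unrestricted. A short sketch under the same assumptions as before (example.com and 'otherbot' are placeholders):

from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: googlebot",
    "Disallow: /admin/",
    "Disallow: /log/",
    "Disallow: /stats/",
]

parser = RobotFileParser()
parser.parse(rules)

# Googlebot is blocked, but a crawler with no matching group is not.
print(parser.can_fetch("googlebot", "http://example.com/log/today.txt"))  # False
print(parser.can_fetch("otherbot", "http://example.com/log/today.txt"))   # True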

As a last example, we want to exclude the file file.html from indexing by all search engines. We must write (note the leading slash, which is required for the path to match):

User-agent: *
Disallow: /file.html
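Once the file is published at the root of your site, you can point the same parser at the live URL instead of a list of lines. A minimal sketch, again assuming the placeholder domain example.com:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("http://example.com/robots.txt")
parser.read()  # downloads and parses the live file

print(parser.can_fetch("*", "http://example.com/file.html"))  # False if the rule is in place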


Antok Mashuri