Billy! Established Member · 20 · Impact 12 May 5, 2007 890 views 6 replies #1 Stupid question maybe? But what do I have to put into a robots.txt file for my vB forums to keep the spiders happy and to get more visitors to my sites?
Nathan Carpe Diem VIP Member VIP · 15 · Impact 61 May 5, 2007 #2 Not exactly necessary, vB does an OK job with this, but I still recommend adding one, as it takes like 4 seconds to do :P Here is a guide on robots.txt: http://rsforums.org/forums/showthread.php?t=2544 It includes the text that you want to put in a robots.txt file for vB as well, so you can just copy and paste that :P
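For reference, a robots.txt for a vBulletin forum typically looks something like the sketch below. The directory and script names here are assumptions based on a default vBulletin 3.x layout, so check them against your own install (and the linked guide) before using it:

```text
# Sketch of a robots.txt for a default vBulletin 3.x install.
# Directory and script names are assumptions; verify against your forum.
User-agent: *
Disallow: /admincp/       # admin control panel
Disallow: /modcp/         # moderator control panel
Disallow: /clientscript/  # JS/CSS assets, nothing worth indexing
Disallow: /images/
Disallow: /private.php    # private messages
Disallow: /register.php
Disallow: /search.php     # crawler trap: endless search-result URLs
```

The file must sit at the web root (e.g. http://www.example.com/robots.txt); crawlers only look for it there, not in subdirectories.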
Billy! Established Member · 20 · Impact 12 May 5, 2007 1 point #3 Thanks so much Matt, that has helped me greatly ... I did the last one!
ag9 Established Member · 15 · Impact 5 May 11, 2007 #5 You can also point search engines to your sitemap from robots.txt: Sitemap: http://www.example.com/sitemap.xml This helps search engines index your site more thoroughly. Last edited: May 11, 2007
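If you already have a robots.txt, the Sitemap line can go anywhere in the file; it is independent of any User-agent group. A minimal file (using the example.com URL above as a placeholder) would be:

```text
# Allow everything, and advertise the sitemap location.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

Note that the Sitemap directive takes a full absolute URL, not a relative path.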
Keral_Patel · Restricted (Chatroom) · Impact 1,449 May 12, 2007 #6 I have always created a blank robots.txt and used the nofollow meta tag to hide duplicate content from the search engines. Just yesterday I found out that one of my friend's sites had a problem in its robots.txt: he had accidentally disallowed Googlebot.
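That kind of accident usually comes down to a single character. An empty Disallow value permits everything, while "Disallow: /" blocks the entire site, so the two (hypothetical) rule sets below behave in opposite ways:

```text
# Allows all compliant crawlers to index everything:
User-agent: *
Disallow:

# Blocks all compliant crawlers (including Googlebot) from the whole site:
# User-agent: *
# Disallow: /
```

Double-checking the file in Google's webmaster tools is the quickest way to catch an accidental blanket Disallow like the one described above.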
sufi VIP Member VIP · 20 · Impact 129 May 12, 2007 #7 I use this tool to generate robots.txt files for my websites. I mainly do it to block the images directory to save bandwidth.
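Blocking an images directory doesn't need a generator; assuming the images live under /images/, the whole file is just:

```text
# Stop well-behaved crawlers from fetching images, to save bandwidth.
User-agent: *
Disallow: /images/
```

Keep in mind this only stops compliant bots; it does nothing against hotlinking or direct downloads.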