Hi, not sure if this is the right forum for this post, but it seems more appropriate than the others. I have a newish website that is being crawled by Googlebot and other spiders, and they seem to be eating up a lot of my 10,000 MB of bandwidth. From what I've read, the presence of these spiders is no bad thing, as it helps with search results. However, I fear I could run out of bandwidth and my site could go down!

I know I can reduce the crawl rate, or edit my robots.txt file to exclude bots from certain areas of the site (or from all of it). What I'm wondering is: if I take either of those actions, will the curtailment affect my search rankings?

Any help/suggestions would be most welcome. David.
You may have already done this, but if you have a folder that's just full of images and nothing else, it's useless to the robots, so disallow it in your robots.txt with something like "Disallow: /images/" (or whatever your image folder is actually called).
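For reference, a minimal robots.txt sketch along those lines might look like the following. The /images/ path is only an example; swap in your own folder names.

    User-agent: *
    Disallow: /images/
    Crawl-delay: 10

The file goes in the root of your site, e.g. http://www.example.com/robots.txt. One caveat: Googlebot ignores the Crawl-delay line (for Google you set the crawl rate in Search Console instead), though some other crawlers do honour it.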