The site that the Google bots are crawling is HUGE: dozens of pages with hundreds of links. Google appears to be crawling the site at 5-minute intervals, and all of this indexing (or whatever it's doing) is taking up resources and costing big money. I'm afraid that if I add a robots.txt file, the site will not come up as high in Google search results.
Any help or info? I have found plenty of articles that tell me "this is what a web crawler is", but nothing that goes past that.
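Before changing anything, it's worth confirming how often Googlebot is actually hitting the server. A rough sketch, assuming an Apache-style access log (the `/var/log/apache2/access.log` path is just an example; substitute your server's log location):

```shell
# Count Googlebot requests per hour from an Apache-style access log.
# The LOG path is an assumption -- point it at your own log file.
LOG=${LOG:-/var/log/apache2/access.log}

grep -i 'googlebot' "$LOG" \
  | awk '{print substr($4, 2, 14)}' \
  | sort | uniq -c | sort -rn | head
```

The `awk` step trims the bracketed timestamp field (e.g. `[01/Jan/2024:13:00:01`) down to an hour bucket like `01/Jan/2024:13`, so the output is a count of Googlebot hits per hour. Note that anyone can spoof the Googlebot user-agent string, so treat this as an estimate.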
Sign in to Google Webmaster Tools and select your site. There are a number of options there; one of them is 'Tools', and under that, 'Crawl Rate'. Click on that and you will get an option to vary the crawl rate for your site. If you choose the slower setting, it will reduce the load on your web server.
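Also, on the robots.txt fear: a robots.txt file only blocks the paths you explicitly list, so it doesn't have to hide the whole site. A minimal sketch (the directory names here are hypothetical; substitute whichever resource-heavy sections you don't need indexed):

```text
# robots.txt -- placed at the site root (e.g. example.com/robots.txt)
# Only the listed paths are blocked; everything else stays crawlable.
User-agent: *
Disallow: /search/      # hypothetical: expensive dynamically generated pages
Disallow: /print/       # hypothetical: duplicate printer-friendly versions
```

Pages that aren't disallowed are still crawled and ranked as before, so a targeted robots.txt shouldn't hurt how the rest of the site shows up in Google.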