Unless you have unlimited bandwidth on your hosting account, it may be worth considering how often you download and update your feeds when using Google Sitemaps.
For the last month, I downloaded all feeds (200,000 products) every other day. This in turn caused Google to re-visit all those pages, as the sitemap was reporting them as modified.
The result was that in one month Google alone consumed 10GB+ of bandwidth, and of course the more products you have, the more Googlebot will eat.
So it is important to remember that most feeds are not updated on a regular basis, which is why I have created 3 download scripts:
1. daily
2. weekly
3. monthly
I have placed different feeds into different downloads based on how often each feed is normally updated, while still retaining the ability to manually download and import a file.
Now Googlebot is consuming a more reasonable amount of bandwidth, leaving valuable server resources for real visitors.
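As a rough sketch of how the three scripts could be scheduled with cron (the script names and paths below are hypothetical, not the actual ones used):

```shell
# Hypothetical crontab entries -- adjust paths and times to suit your server.
# Frequently updated feeds: fetch every day at 02:00
0 2 * * * /path/to/download_feeds_daily.sh
# Less active feeds: fetch every Sunday at 03:00
0 3 * * 0 /path/to/download_feeds_weekly.sh
# Rarely updated feeds: fetch on the 1st of each month at 04:00
0 4 1 * * /path/to/download_feeds_monthly.sh
```

With a split like this, only the feeds that actually changed get re-reported as modified in the sitemap, so Googlebot has far fewer pages to re-crawl.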
Thanks for the heads-up, Searley!