By crawling I mean the bandwidth robots consume on my site, in GB per month. The growth in my articles and comments should then indicate how much of the bandwidth increase corresponds to actual new content. Once I have calculated this, I will write to the EU and demand smarter crawling from Google and Bing: fetch only new or changed material instead of constantly re-crawling the entire site. Web hosts should automatically ban bad bots. This would reduce bandwidth, cut the electricity servers need, and lower costs for smaller sites that are not aware of this problem. Thanks for your suggestion about the memory check.
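In case it helps anyone measuring the same thing: below is a minimal Python sketch (my own idea, not any host's tooling) that sums the bytes served to the big crawlers per month from a standard Apache/Nginx combined access log. The log path and the bot list are assumptions; adjust both to your server.

Code:
import re
from collections import defaultdict

# Hypothetical path; point this at your host's access log.
LOG_PATH = "access.log"

# User-Agent substrings that identify the major crawlers.
BOTS = ["Googlebot", "bingbot", "YandexBot", "AhrefsBot", "SemrushBot"]

# Combined log format, e.g.:
# 1.2.3.4 - - [01/Jul/2024:10:32:00 +0000] "GET /page HTTP/1.1" 200 12345 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; ...)"
LINE_RE = re.compile(
    r'\[(?P<day>\d{2})/(?P<month>\w{3})/(?P<year>\d{4}):.*?\] '
    r'"[^"]*" \d{3} (?P<bytes>\d+|-) "[^"]*" "(?P<ua>[^"]*)"'
)

totals = defaultdict(int)  # (year-month, bot) -> bytes served

with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m or m.group("bytes") == "-":
            continue
        ua = m.group("ua")
        for bot in BOTS:
            if bot in ua:
                key = (f'{m.group("year")}-{m.group("month")}', bot)
                totals[key] += int(m.group("bytes"))
                break

for (month, bot), nbytes in sorted(totals.items()):
    print(f"{month}  {bot:12s}  {nbytes / 1e9:.3f} GB")

Comparing these monthly totals against how much content was actually added that month should show how much of the crawl is just re-fetching unchanged pages.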
Statistics: Posted by Slackervaara — Mon Jul 01, 2024 10:32 am