16 May 2012
More than 50 percent of the overall traffic received by small and mid-size websites is generated by bots and automated scripts, according to a study by traffic monitoring company Incapsula.
Although the presence of bots and crawlers is common knowledge and is usually desirable, since they index and rank pages on behalf of search engines, the report reveals that more than 30 percent of the automated traffic is malicious.
Since crawlers and bots are not legitimate sources of revenue, the traffic they generate, which the website owner usually pays for, is a waste of resources. More than that, this artificial traffic poses a serious threat to the server's security, especially in shared hosting environments. According to Incapsula, every site is visited by a similar number of bots, regardless of its size. As most shared webhosting providers tend to cram tens or hundreds of websites onto a single server, the volume of bot-generated traffic can lead to bandwidth overload and, in some cases, to a distributed denial of service.
Performance degradation is another reason to worry: the hosting company wastes over 50 percent of its server capacity, while the average page load time increases by more than 50 percent. And while legitimate bots usually aid search-engine ranking, the massive parasitic traffic hurts SEO rankings as well.
The research sheds new light on the cost of malicious traffic. Until now, the cost of cyber-crime was a matter of attackers successfully breaching the server and leaving with the spoils; the analysis of malicious traffic reveals that web business owners can actually lose money simply by allowing bots to crawl their sites.