What Is Bot Traffic? How to Detect Bots and Protect Your Project From Them

Mari Kuznetsova

If you own a website, you might have wondered whether all your visitors are human. With the advances in technology, it is no surprise that bots can account for over 40% of website traffic. Essentially, bot traffic is any non-human traffic to a blog or website, and it can be good or bad for the site owner depending on the purpose of the bots. Some bots are useful (for example, search engine crawlers and digital assistants), while others are malicious and attempt to steal credentials or launch attacks on your website. This is exactly why you need to understand what kind of bot traffic you are getting and how to build an efficient bot traffic management strategy.
In this post, you will learn about the different types of bots and the consequences of their activity on a site, and find out how to detect bot traffic and reduce the number of bots on your site.

This article was prepared together with the developer of Antibot.cloud.

Types of Bots and Consequences of Their Activity

Visitors to a site can be divided into three groups:

  • Real people visiting through a browser.
  • Useful bots (e.g., search engine bots).
  • “Bad” bots that can cause damage to your site.

Website owners often don’t even know how many bots visit the pages of their resources. According to Antibot, real people may account for as little as 0.5% of a site’s total visits. You have probably noticed spikes in visits, as well as visits from unusual locations or strange devices, especially while studying Google Analytics reports. Most likely, bots were responsible for these spikes. For example, in 2021, we noticed a traffic increase on the Travelpayouts blog from devices with low screen resolutions. This is one of the many manifestations of bot traffic.

[Image: bot traffic on the site]

Bots can be divided into two categories:

  • Bots that perform specific actions on the site.
  • Bots that move from one site to another. For example, a bot clicks an affiliate link on a partner’s website and follows it to the brand’s resource.

The purpose of such bots and the consequences of their activity may differ.

Bot Traffic on the Site

Search Bots (Crawlers)

Search bots are the bots of search engines. They search for new pages on the web, scan those pages, and help index them in search.

These robots are useful for SEO and don’t cause any problems.

SEO Bots

SEO bots are owned by SEO platforms (such as Ahrefs, SEMrush, Serpstat, and others) that help their users analyze competitors’ resources. Website owners can use such data to improve their search positions.

Consequences:

The only downside of such bots is that they can slow down site loading. If you don’t use SEO platforms, you can block them.
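A minimal robots.txt sketch for doing so (AhrefsBot and SemrushBot are the user-agent tokens those vendors document; robots.txt is advisory, so this only stops crawlers that play by the rules):

    # Ask SEO-platform crawlers to skip the entire site
    User-agent: AhrefsBot
    Disallow: /

    User-agent: SemrushBot
    Disallow: /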

Bots That Manipulate Behavioral Factors

Manipulating behavioral factors with bots is a “grey” method of promoting a site in the search results. On the site that commissioned the service, these bots simulate visits from real people. For example, they can browse different pages over a long period of time to reduce the bounce rate.

To push the commissioned resource to the top, the bots also worsen behavioral factors on the sites that currently occupy the first lines of the search results. There, on the contrary, they spend little time on pages and inflate the bounce rate.

Consequences:

Your resource can become one of the sites that bots are trying to push out of the top search positions. As a result, your site may end up on the back pages of the search results.

Scraper Bots

Scrapers collect valuable data from pages, such as addresses or other contact details. In some cases, scrapers steal content outright.

Consequences:

Your content may be stolen and published on other resources. In such cases, articles and images become less unique and, because of this, the site may lose its position in the search results.

Vulnerability Scanners

These bots analyze millions of sites, searching them for vulnerabilities. Some report the problems they find to the site owner, while others use this information for their own gain.

Consequences:

Malicious scanners transmit the collected information to third parties, who may sell the data or use it to hack sites.

Download Bots

Download bots are created to download materials from your site, such as the free guides or helpful checklists that website owners use as lead magnets in their marketing strategies.

Consequences:

Download bots give the wrong impression about the performance of your materials. Suppose you have affiliate links in a guide. You notice a large number of guide downloads, but the number of clicks on the links does not match your average conversion rate, so you decide to rework the guide or change the marketing funnel. However, if the downloads were actually inflated by bots, days of rework are unlikely to improve the conversion rate.

Spam Bots

Spam bots fill in contact forms, send phishing emails, and create meaningless or advertising-related comments on websites and social networks.

Consequences:

Spam bots quickly become a nuisance and can damage a site’s reputation, since you have to constantly sift through their emails and moderate their comments.

Bot Traffic From the Site

Click Bots

Click bots mimic user behavior and engage in fraud, such as clicking on advertisements, partner links, and other tools.

Consequences:

If you run pay-per-click ads on your site, such as AdSense banners or partner links from CPC (cost-per-click) affiliate programs, your account can be banned because of click bots. Click bots also distort the statistics of partner CPA (cost-per-action) programs, because they generate many impressions and clicks on partner tools but no sales.

How to Detect Bot Traffic

1. Review Data From Analytics Systems

A project may coexist with bots for years, and the negative effects may not show up immediately or may be barely noticeable. For example, bot traffic can gradually lower the positions of the site’s pages, which slows down the development of the project.

The number of unwanted bots increases as the site grows: as the resource becomes more trustworthy and gains pages and backlinks, it also attracts more bots. Therefore, the primary task of the website owner is to detect malicious bot traffic.

You can estimate the number of bots from server logs such as access.log. You can also spot bot activity in Google Analytics: pay attention to spikes in page views, high bounce rates, unusual session lengths, and visits from strange locations.
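As an illustration, here is a small Python sketch that tallies self-declared bots in an access.log file in the common combined log format (the log path and the keyword list are assumptions to adjust for your server; bots that fake a browser user agent will not be caught this way):

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # adjust to your server's log location

    # Substrings that self-identified bots commonly put in their user agents
    BOT_KEYWORDS = ("bot", "crawl", "spider", "slurp", "fetch")

    # In the combined log format, the user agent is the last quoted field
    UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

    total = 0
    bots = Counter()

    with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = UA_PATTERN.search(line)
            if not match:
                continue
            total += 1
            user_agent = match.group(1)
            if any(word in user_agent.lower() for word in BOT_KEYWORDS):
                bots[user_agent] += 1

    bot_hits = sum(bots.values())
    print(f"{bot_hits}/{total} requests ({bot_hits / max(total, 1):.1%}) look like bots")
    for user_agent, hits in bots.most_common(10):
        print(f"{hits:>7}  {user_agent}")

Run it next to your log file; the top of the list usually shows which crawlers visit you most often.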

2. Review Affiliate Program Statistics Without Bots

If you earn through affiliate marketing, keep in mind that visits from all types of bots end up in the statistics of partner programs. This distorts the picture of your conversion, as it becomes difficult to estimate how many real people are interested in your content and are clicking on your partner tools. Because of the inaccurate statistics, you might make the wrong decision about working with certain brands. For instance, you might think that you attract a lot of traffic but end up not getting a high enough reward for your work.

To make our partner statistics more accurate, we at Travelpayouts made a unique anti-bot filter. It automatically excludes bot traffic from program reports. You can learn more about how the Travelpayouts feature works in our Help Center.

3. Introduce CAPTCHA

CAPTCHA is a test that website owners can use to differentiate human users from bots. This test is easy for humans to complete in order to continue using a website, while bots usually cannot finish the necessary tasks. For example, CAPTCHA is often used before the user signs up for a newsletter to ensure that the website only gathers the email addresses of actual people.
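One common implementation is Google reCAPTCHA: the widget runs in the visitor’s browser, and your backend confirms each submission with Google’s documented siteverify endpoint before accepting it. A minimal Python sketch (the secret key is a placeholder issued by the reCAPTCHA admin console, and is_human is just an illustrative helper name):

    import requests  # third-party: pip install requests

    VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
    SECRET_KEY = "your-secret-key"  # placeholder: issued in the reCAPTCHA admin console

    def is_human(captcha_token, client_ip=None):
        """Verify the token that the reCAPTCHA widget posted with the form."""
        payload = {"secret": SECRET_KEY, "response": captcha_token}
        if client_ip:
            payload["remoteip"] = client_ip  # optional extra signal
        result = requests.post(VERIFY_URL, data=payload, timeout=5).json()
        return result.get("success", False)

Only process the form (save the email address, post the comment, and so on) when the check returns True.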

However, as bots are getting smarter, CAPTCHA might not protect your website in all cases. Furthermore, it can make the user experience less enjoyable and boost your bounce rate. Therefore, CAPTCHA can be a good addition to your bot traffic management strategy, but it is not the only solution that can be introduced.

4. Help Good Bots Crawl More Efficiently

Depending on how big your website is, not all pages might need to be crawled. You can restrict bot access with the help of robots.txt. This file tells search engines which pages on your website can be crawled, but it does not hide your site from Google and other engines.

Using robots.txt can save you time and energy while optimizing how your site is crawled. For example, you can hide internal search results pages or staging content, which helps crawlers focus on what is important and discover your key content faster. You can also exclude useless links created by your plugins or blogging platform.
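For instance, a robots.txt sketch that keeps crawlers out of internal search results might look like this (the paths are assumptions: /search/ is a generic example, and ?s= is the default WordPress search parameter; the * wildcard is honored by Google and Bing, though it is not part of the original robots.txt standard):

    # Keep crawlers out of internal search results pages
    User-agent: *
    Disallow: /search/
    Disallow: /*?s=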

5. Limit the Crawl Rate

You can also limit the crawl rate by using robots.txt, so that bots will not check your site and links too often. This can be a great idea for medium and large websites, which crawlers often scan. If you do not change your content multiple times during a day, setting up a crawl delay will allow bots to avoid wasting too much effort by crawling the same page over and over. Thus, search engines can identify your newest pages much faster because they will not have to go through all your content again.

Change your crawl delay and track the results in your analytics software. You can start with a small delay and increase it if there are no negative effects. Note that not all crawlers honor a crawl delay, but it is still a good solution where supported.
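A sketch of such a directive (note that Bing and Yandex honor Crawl-delay, while Googlebot ignores it; Google’s crawl rate has to be managed through its own tools instead):

    # Ask supporting crawlers to wait 10 seconds between requests
    User-agent: *
    Crawl-delay: 10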

How to Protect Your Project From Bot Traffic and When to Start?

It is important to regularly check your traffic for malicious bots and be ready to start protecting yourself at any time. Even if you don’t notice any negative effects yet, try to estimate the amount of bot traffic. If bots account for more than 25% of all your traffic, you risk running into problems on Google, and that is when you need to urgently begin protecting your website.

You should understand that it is impossible to get rid of 100% of bots. However, there are simple services that will help you reduce their number. These services will protect your content, preserve your positions in the search results, and shield your advertising and links from click fraud.

CloudFlare

CloudFlare protects approximately 30% of all sites.

Advantages:

  • Caches pictures, which reduces the load on the server. 
  • If you configure three simple rules, you will get excellent protection from many bots that do not support HTTP/2: primitive content scrapers, vulnerability checkers, and some spammers. (A sketch of such a rule appears at the end of this section.)
  • Protects you against DDoS attacks.
  • All of this is available with the free plan. Even paid DDoS attack protection services generally have no significant advantages over CloudFlare.

Disadvantages:

  • CloudFlare does not protect against bots that manipulate behavioral factors.

CloudFlare offers many options. For example, it is free for personal websites and for apps with up to 50 users. If you need more powerful security features, paid plans start at $5 per month.
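For illustration, one of the rules mentioned above might look like this in CloudFlare’s rule-expression language (a sketch based on the field names CloudFlare documented at the time of writing; verify them in your dashboard and pair the expression with a challenge action rather than an outright block):

    (http.request.version in {"HTTP/1.0" "HTTP/1.1"}) and not cf.client.bot

This challenges clients that only speak pre-HTTP/2 protocol versions while exempting verified crawlers; real browsers pass the challenge, primitive scrapers usually do not.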

AntiBot 

AntiBot helped solve the problem with bots on the Travelpayouts blog. We installed it in February 2021, and it stops all kinds of bots, including those that manipulate behavioral factors. AntiBot can be used together with CloudFlare.

During the AntiBot check, users see a page similar to the CloudFlare challenge page for 1-3 seconds and are then automatically passed through to the site. If a user doesn’t pass the automatic test, which happens in about 5% of cases, they can press a button to proceed.

[Image: the Antibot.cloud protection test page]

Advantages:

  1. Easy integration and settings.
  2. Maximum protection against bots that masquerade as humans to manipulate behavioral factors.
  3. With AntiBot, sites do not get banned and do not lose their positions in the search results because of bots.
  4. The test is translated into the most popular languages, and you can customize this page.
  5. AntiBot protects advertisements from click-fraud bots without interfering with the way the ads operate. Sites with AntiBot successfully pass moderation in AdSense (during setup, you need to add the AdSense bots to the whitelist).
  6. If you don’t have experience installing scripts, you can order turnkey installation and setup for an additional fee.

Disadvantages:

  1. Sites with AntiBot will not be able to advertise on Google Ads, as they will not pass moderation.
  2. Does not provide protection against DDoS attacks. 

The cost of AntiBot depends on the number of domains and varies from $25 to $99 per year. For readers of the Travelpayouts blog, AntiBot has prepared a promo code for a 30+ day test period on one domain and all of its subdomains. The TP2022 coupon is valid for registration until September 1, 2023.

How to Detect Bot Traffic in Google Analytics (Step by Step)

To make informed decisions about your business, it is important to get the right data on your performance. Bot traffic can cloud your decision-making and lead to poor efficiency. So, be sure to spot bot traffic and ensure the accuracy of your reports.

How can you identify bot traffic in your Google Analytics account? If you check the session graphs and see spikes in traffic that are not related to any particular campaign, event, or promotion, they could be the result of bot traffic.

Here is how you can spot bot traffic in Google Analytics:

  1. In your Google Analytics dashboard, go to the left-hand menu and choose “Acquisition”.
  2. Click on “All Traffic”, then “Channels”.
  3. Find the Default Channel Grouping column and click “Referral”.
  4. Check the full list of your referral sources. If some of them are unknown to you, it might be due to bot traffic.
  5. Check the bounce rate and average visit duration. If bot traffic is present, you might see a 100% bounce rate and a zero average visit duration.
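If you export that referral report to CSV, a short script can flag the suspicious rows automatically. A Python sketch (the column names are assumptions based on a typical Google Analytics export; adjust them to match your file):

    import csv

    suspicious = []

    # Column names assumed from a typical Google Analytics referral export
    with open("referrals.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            bounce = float(row["Bounce Rate"].rstrip("%"))
            duration = row["Avg. Session Duration"]  # e.g. "00:00:00"
            # A 100% bounce rate with zero time on site is a classic bot signature
            if bounce >= 99.9 and duration == "00:00:00":
                suspicious.append(row["Source"])

    print("Referral sources that look like bots:")
    for source in suspicious:
        print(" -", source)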

Have you ever detected bots on your site? Share in the comments how these bots affected your project and how you protected your site.