New Feature: Multifunctional Firewall

Hey everyone!

It’s time for another new feature. This time I’d like to introduce our multifunctional firewall. It lets you block different web browsers/user-agents based on custom patterns, IP addresses, IP ranges and verified bots identified by our bot detector. If you decide to block, say, Firefox, then visitors who use the Firefox web browser will be shown a 403 Forbidden page. You can create both blacklist and whitelist rules.
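To give you a rough idea of what happens under the hood, here’s a minimal sketch (our own illustration, not the actual firewall code) of how a user-agent pattern rule turns into a 403 response:

```python
import re

# Hypothetical illustration: a single user-agent pattern rule.
# The pattern and function names are ours, not the real firewall's.
BLOCKED_UA = re.compile(r"Firefox", re.IGNORECASE)

def handle_request(user_agent: str) -> tuple[int, str]:
    """Return an HTTP status code and body for an incoming request."""
    if BLOCKED_UA.search(user_agent):
        return 403, "Forbidden"   # blocked browsers get the 403 page
    return 200, "OK"              # everyone else passes through

print(handle_request("Mozilla/5.0 (X11; Linux x86_64) Gecko/20100101 Firefox/64.0"))
# -> (403, 'Forbidden')
```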

Every single request to your domains goes through our bot detector. Our bot detector can identify the following bots by name: Googlebot, Bingbot, Ahrefsbot, Semrushbot, Majestic bot, Yandexbot, Baidubot, MOZ bot (DotBot, Rogerbot, Ezooms), BLEXbot (LinkAssistant), Archive.org crawler, SearchMetrics crawler, SEOkicks bot, Sistrix bot, Lipperhey spider, BacklinkTest crawler, PagesInventory bot, OpenLinkProfiler bot, Linkdex bot, IBM Watson bot, Blekko bot, CognitiveSEO bot, Linkpad.ru bot and SEOZoom bot (as of 16/01/2019; this list will grow over time). Our bot detector database contains a very large number of entries, and any other bot is identified simply as CRAWLER.

What are blacklist and whitelist rules?

It’s very simple. A blacklist rule blocks the bots/IPs that you define and lets everyone else through. A whitelist rule is the opposite: it allows traffic only from the bots and IPs that you define and blocks everyone else. In most cases you should use blacklist rules, because if you accidentally use a whitelist rule and forget to include Googlebot, your websites can get deindexed. So be very careful!
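Here’s a tiny sketch of the difference (hypothetical helper names, purely for illustration):

```python
# Hypothetical helpers, purely to illustrate the two rule types.
def blacklist_allows(identity: str, blocked: set[str]) -> bool:
    return identity not in blocked   # block what's listed, let everyone else through

def whitelist_allows(identity: str, allowed: set[str]) -> bool:
    return identity in allowed       # allow only what's listed, block everyone else

# The whitelist pitfall: forget Googlebot and it gets blocked too.
print(blacklist_allows("GOOGLEBOT", {"AHREFSBOT"}))  # True  -> Googlebot gets through
print(whitelist_allows("GOOGLEBOT", {"BINGBOT"}))    # False -> Googlebot blocked!
```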

How do you know Googlebot is actually Googlebot?

Some bots, like Googlebot, Bingbot, Baidubot and Yandexbot, get additional verification, which includes an ISP check, a reverse DNS (RDNS) check and some other checks. This way we can be sure that those bots actually are what they claim to be.
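For the technically curious, the RDNS part works roughly like the verification method Google itself documents publicly: look up the hostname for the IP, check its domain, then resolve that hostname forward and make sure it points back to the same IP. A minimal Python sketch (our own, simplified; our real checks do more than this):

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check, as publicly documented by Google."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)          # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward lookup must match
    except (socket.herror, socket.gaierror):
        return False                                   # no DNS record -> not verified

print(is_verified_googlebot("66.249.66.1"))  # an address from a known Googlebot range
```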

Can you guarantee that Ahrefs and Semrush bots get blocked?

Unfortunately, we cannot. Bots like the Ahrefs and Semrush bots are known to use different IP ranges and user-agents to bypass blocks, so we cannot guarantee 100% success in blocking them.

Does your firewall work for cached websites as well?

Yes, of course. Our firewall works at the same level as the caching: the request first goes through our bot detector, and only then does our system decide whether to serve a cached page or a live page.
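In other words, the flow looks roughly like this (a simplified sketch with made-up names, not our real system):

```python
# A simplified sketch of the request flow (made-up names, not our real system).
def serve(user_agent: str, url: str, blocked: set[str], cache: dict[str, str]) -> str:
    if user_agent.upper() in blocked:   # 1. bot detector + firewall rules run first
        return "403 Forbidden"          #    blocked requests never even touch the cache
    if url in cache:                    # 2. only allowed traffic can hit the cache
        return cache[url]
    return "<live page>"                # 3. cache miss falls through to the live site

cache = {"/": "<cached homepage>"}
print(serve("AHREFSBOT", "/", {"AHREFSBOT"}, cache))  # 403 Forbidden
print(serve("GOOGLEBOT", "/", {"AHREFSBOT"}, cache))  # <cached homepage>
```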

Do I have to set up rules for every single website I have?

No, there’s no need to. You can create just one rule and apply it to all of your domains. But if you add new domains, don’t forget to include them in your firewall rule.

Where can I set up these rules?

You can find the firewall page under the “IP Network” menu by clicking on “Network Firewall”.

Setting up these rules looks difficult. Is there a simple way to create them?

Sure! We offer two different forms. The first one is a simple form that lets you select just the bots that our bot detector detects by name. If you know what you are doing, or have specific bots that you wish to block or allow through, you can use the advanced editor. It lets you enter IPs, IP ranges and custom patterns for user-agents, and of course select bots that our bot detector can identify by name.
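Conceptually, an advanced rule combines all four match types. The sketch below uses made-up field names for illustration only; the advanced editor has its own format:

```python
import ipaddress
import re

# Hypothetical rule shape, purely for illustration; not the editor's real format.
rule = {
    "ips": {"203.0.113.7"},                                # single IP addresses
    "ranges": [ipaddress.ip_network("198.51.100.0/24")],   # CIDR IP ranges
    "patterns": [re.compile(r"SemrushBot", re.I)],         # custom user-agent patterns
    "named_bots": {"AHREFSBOT"},                           # bots identified by name
}

def rule_matches(rule: dict, ip: str, user_agent: str, bot_name: str | None) -> bool:
    addr = ipaddress.ip_address(ip)
    return (
        ip in rule["ips"]
        or any(addr in net for net in rule["ranges"])
        or any(p.search(user_agent) for p in rule["patterns"])
        or bot_name in rule["named_bots"]
    )

print(rule_matches(rule, "198.51.100.42", "Mozilla/5.0", None))  # True (inside the CIDR range)
```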

That’s it for now. If you have any questions, feedback or need help setting up your first rule, contact our support and they can help you out.
