Bots often put a heavy load on a site, consuming CPU and RAM for no good reason, and this can slow the site down. Some bots are useful, such as search engine crawlers, while others are useless: they scan the site to scrape data or probe it for vulnerabilities.

Sometimes you want to block access for such useless bots.

This can be done in the .htaccess file. If this file already exists in the root of your site, simply add the entries to it; if it does not exist, create it in the root of your site.

You can block a bot by its IP address with the following directive:

Deny from 192.168.0.100

This entry blocks all requests to the site coming from the IP address 192.168.0.100.
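
Note that Order/Deny/Allow directives come from the older Apache 2.2 access-control syntax; on Apache 2.4 they only keep working through mod_access_compat. If your server runs Apache 2.4 or newer and your host allows the relevant overrides in .htaccess, a rough equivalent using the newer Require syntax is:

<RequireAll>
    # let everyone in, except the listed IP
    Require all granted
    Require not ip 192.168.0.100
</RequireAll>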

You may also need to close the site to everyone and leave access only for your own IP; in that case the entry looks like this:

Order Deny,Allow 
Deny from all
Allow from 192.168.0.100
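
On Apache 2.4, the same restriction can be expressed with a single directive in the newer syntax (again assuming your host permits it in .htaccess); everyone whose IP does not match is rejected:

Require ip 192.168.0.100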

If the bot uses many IP addresses and they change constantly, you can block it by its User-Agent value:

BrowserMatchNoCase "SemrushBot" bots
BrowserMatchNoCase "AhrefsBot" bots
BrowserMatchNoCase "MJ12bot" bots
Order Allow,Deny
Allow from all
Deny from env=bots
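
On Apache 2.4 the same idea can be written with BrowserMatchNoCase plus the newer Require syntax. This is a sketch assuming the Require directives are allowed in .htaccess on your host:

BrowserMatchNoCase "SemrushBot" bots
BrowserMatchNoCase "AhrefsBot" bots
BrowserMatchNoCase "MJ12bot" bots
<RequireAll>
    # allow everyone except requests flagged with the "bots" variable
    Require all granted
    Require not env bots
</RequireAll>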

Here is an example of how to block all POST requests while still allowing GET requests (the [F] flag returns a 403 Forbidden response):

RewriteEngine On
RewriteCond %{REQUEST_METHOD} POST
RewriteRule .* - [F,L]
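
If you only want to block POST requests coming from specific bots rather than from everyone, the two approaches can be combined. This is a minimal sketch, assuming the same bot names as above and that mod_rewrite is enabled:

RewriteEngine On
# reject POST requests whose User-Agent matches one of the listed bots
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{HTTP_USER_AGENT} (SemrushBot|AhrefsBot|MJ12bot) [NC]
RewriteRule .* - [F,L]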
