Scraper Bot Protection

Lifeboat can set up the nginx Ultimate Bad Bot Blocker to protect the server from malicious bots and scrapers. Requests that match rules built into this system are dropped by nginx. This view also allows configuration of custom whitelist and blacklist rules.
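Once the blocker is enabled, its effect can be checked from another machine with curl. This is only a rough sketch: the User-Agent string below is an example of a commonly blacklisted crawler name, and whether it appears on the current built-in list is an assumption.

    # Replace your-server.example with the server's hostname.
    curl -I -A "80legs" http://your-server.example/        # a blacklisted agent should get no response (connection dropped)
    curl -I -A "Mozilla/5.0" http://your-server.example/   # a normal browser agent should get a regular HTTP response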

The "Protect server from bad robots" checkbox toggles the entire system on and off.

The "Update blocklists automatically" checkbox creates or deletes a "bot-blocker-updates" script in the cron.daily folder. Note that this script will also appear in Lifeboat's Cron Jobs Daily list.
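As a rough sketch, the daily job boils down to refreshing the blocker's generated blocklists and reloading nginx. The actual script contents are written by Lifeboat; the update-ngxblocker helper shown below is the blocker project's standard updater, and its install path here is an assumption.

    #!/bin/sh
    # Sketch of a cron.daily "bot-blocker-updates" job (actual contents are managed by Lifeboat).
    /usr/local/sbin/update-ngxblocker      # download the latest generated blocklists
    nginx -t && nginx -s reload            # reload nginx only if the configuration still parses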

Custom Rules

✅ or ◯ Allow Rule - Matching requests are allowed through to nginx
❌ or ╳ Deny Rule - Matching requests cause nginx to drop the connection

Click the symbol in the list to toggle a rule between Allow and Deny.

The "Look in" column is the parameter in which to search for a specific rule value. The "haystack". Search against the IP Address, the HTTP Referrer field, or the HTTP User-Agent header. Click this field to open a menu for selecting the search parameter.

The "For value" column is the "needle" value which will be used to match a rule. Except IP Addresses, values are passed to nginx enclosed in quotes. To make a search pattern a regular expression, prefix it with ~*. Read more the nginx documentation. IP Addresses can use CIDR Block notation. Click once and wait a moment to edit the field.

Making any change to the rules enables the "Save" button. Changes are not applied until "Save" is clicked, and navigating away discards any unsaved changes.