Websites are built to offer information and services for human consumption. However, it is not only people who visit them. Many automated software programs, known as bots, also visit websites, sometimes far more frequently than humans do, and they do so with very different purposes.
While good bots like search engine crawlers benefit your website by indexing your pages, malicious bots can cause serious damage. They can compromise user accounts, scrape content and data, generate false reports, and slow down websites by flooding pages with requests.
How do you let the good bots in and keep the bad bots out?
Our Bot Defense feature leverages our client-cloud architecture to categorize these bots as either beneficial or malicious. We monitor each request sent to your website, applying machine learning algorithms to the data that our Nanovisor technology and Web Application Firewall gather. This enables us to identify malicious bots and mitigate any detrimental impact before they reach your origin servers.
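The actual system relies on machine learning over Nanovisor and WAF telemetry, which is not shown here. As a loose illustration of the underlying idea, the sketch below classifies a request from a few simple signals using hand-written rules; every name and threshold in it (`RequestSignals`, `KNOWN_GOOD_CRAWLERS`, the rate limit) is hypothetical and chosen only for the example.

```python
# Toy sketch: rule-based request classification, standing in for the
# real ML-based pipeline. All signal names and thresholds are invented
# for illustration and are not the actual Bot Defense logic.
from dataclasses import dataclass


@dataclass
class RequestSignals:
    user_agent: str            # declared client identity
    requests_per_minute: int   # observed request rate from this client
    honors_robots_txt: bool    # whether the client respects crawl rules


# Hypothetical allow-list of well-known search engine crawlers.
KNOWN_GOOD_CRAWLERS = {"Googlebot", "Bingbot"}


def classify(sig: RequestSignals) -> str:
    """Label a request as a good bot, a bad bot, or human/unknown."""
    # Recognized crawlers that respect robots.txt are let in.
    if sig.user_agent in KNOWN_GOOD_CRAWLERS and sig.honors_robots_txt:
        return "good_bot"
    # An extreme request rate suggests scraping or abuse.
    if sig.requests_per_minute > 120:
        return "bad_bot"
    return "human_or_unknown"
```

In practice, a single rule like a rate threshold is easy for attackers to evade, which is why the real system combines many signals with learned models rather than fixed rules.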