Granular bot identification surfaces good and bad bots
Web bots and scrapers are identified by the volume of requests made, the type of content scraped, and user-agent information. They are classified into four primary categories based on their desirability and aggressiveness:
- High desirability, low aggression
- Low desirability, high aggression
- High desirability, high aggression
- Low desirability, low aggression
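The four quadrants above can be thought of as a simple two-axis lookup. The sketch below is purely illustrative; the function name, labels, and examples are hypothetical and do not reflect Bot Manager's internal logic:

```python
# Illustrative sketch of the four-quadrant bot classification.
# Categories and example bots are assumptions, not Akamai's actual taxonomy.

def categorize_bot(desirable: bool, aggressive: bool) -> str:
    """Map a bot's desirability and aggressiveness to a quadrant label."""
    if desirable and not aggressive:
        return "high desirability, low aggression"   # e.g., well-behaved search crawler
    if not desirable and aggressive:
        return "low desirability, high aggression"   # e.g., hostile content scraper
    if desirable and aggressive:
        return "high desirability, high aggression"  # e.g., partner bot polling too hard
    return "low desirability, low aggression"        # e.g., low-volume background noise
```

Each quadrant can then be mapped to a different handling policy, which is the premise of the category-based actions described below.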
Refined categorization leads to specific actions
Flexible categorization enables Bot Manager to assign actions based on 15 different Akamai-defined categories of web and business services. Additionally, customer-defined categories allow organizations to create custom categories for their unique situations. Once bots are categorized, Bot Manager assigns actions for handling good bots, such as search engines, and bad bots, such as those stealing site content.
Categorization enables Bot Manager to better manage search-engine traffic and partner bots and to prioritize third-party services – all of which are good for the business – while minimizing the potentially harmful business impact of content aggregators and grey marketers, as well as the damaging impact of spam and web scraping.
Superior capabilities lead to better response
Bot Manager’s key capabilities ensure a more refined response to bot traffic:
- Akamai-known bots — Akamai continuously updates a directory of known bots based on recent interactions with other Akamai customers using its Cloud Security Intelligence (CSI) data analysis engine.
- Customer-known bots — Bot Manager allows organizations to create custom signatures to identify known bots that regularly interact with their site and assign specific actions to be taken.
- Detection of unknown bots — Bot Manager can detect traffic from unknown bots using a variety of signals, such as request rate, request attributes, bot behavior, and workflow validation.
- Business-oriented policies — Bot Manager enables organizations to categorize different types of bots and create management policies that define how traffic from different categories will be handled based on their business impact.
- Advanced actions — Bot Manager provides a range of actions that can be applied to different bot categories, such as monitor, deny, delay, and more.
- Reporting and visualization — The Security Center dashboard provides real-time visibility into bot traffic for the website as well as the ability to drill down into the activities of individual bots or categories of bots. Bot Manager provides preconfigured reports including bot trending, top bots, and bot activity.
- Logging — You can increase your threat posture awareness by integrating event logs with your security information and event management (SIEM) or other reporting solution through Akamai’s Log Delivery Service (LDS).
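Of the signals listed above, request rate is the most straightforward to illustrate. The following is a minimal sketch of sliding-window rate detection; the thresholds, function names, and data structure are assumptions for illustration, not Bot Manager internals:

```python
# Minimal sketch of rate-based detection of unknown bots over a sliding
# time window. All names and thresholds here are hypothetical.
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 10   # length of the sliding window (assumed)
MAX_REQUESTS = 50     # requests allowed per window before flagging (assumed)

_requests: dict = defaultdict(deque)  # client id -> timestamps of recent requests

def is_suspect(client_id: str, now: float = None) -> bool:
    """Record one request and flag the client if it exceeds the rate threshold."""
    now = time.time() if now is None else now
    q = _requests[client_id]
    q.append(now)
    # Drop timestamps that have aged out of the window.
    while q and q[0] < now - WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_REQUESTS
```

A production system would combine a signal like this with the other characteristics mentioned above (request attributes, behavior, workflow validation) rather than relying on rate alone.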
Actions hidden from bot operators
Bot Manager combines detection and management in a way that doesn’t alert bot operators. Bot traffic continues to be served rather than blocked outright: blocking doesn’t make the problem go away, and it often makes it worse, because once the blocking action is detected by the operator, the bots may return better hidden.
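One general way to manage bots without tipping off their operators is to degrade rather than deny: slow the response down or serve plausible alternate content instead of an obvious error. The sketch below illustrates that general idea under stated assumptions; it is not Bot Manager's implementation:

```python
# Hypothetical sketch of stealthy bot handling: instead of an obvious block
# (e.g., HTTP 403), a suspected bot gets a randomized delay and low-value
# placeholder content, giving the operator no clear signal to evade.
import random
import time

def handle_request(is_suspected_bot: bool, real_response: str) -> str:
    if not is_suspected_bot:
        return real_response
    # Tarpit: a small randomized delay depresses scraping throughput.
    time.sleep(random.uniform(0.5, 2.0))
    # Serve plausible but low-value content instead of an error page.
    return "<html><body>Content temporarily unavailable.</body></html>"
```

From the operator's side, slow or bland responses look like ordinary site conditions, whereas a hard block is an unmistakable cue to rotate IPs or change signatures.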
Visualization and reporting drive better site traffic management policies
To better understand bot traffic and response actions, Bot Manager provides a wealth of information through its reporting functionality, which includes:
- Security Center – an integrated dashboard for monitoring bot activity and other security events.
- Bot Activity Report – an overview of bot activity over predefined or customized time periods.
- Bot Analysis Report – a detailed analysis of specific bot activity with sample traffic logs.