How User Behavior Anomaly Detection Can Protect Your Organization
What is User Behavior Anomaly Detection (Analytics)?
User Behavior Anomaly Detection (also known as User and Entity Behavior Analytics) is the use of artificial intelligence to learn the normal behavior of users or entities in an organization. The idea is that once a baseline of normal behavior is established, cybersecurity teams can detect deviations from that baseline that may indicate a compromised user or entity. This matters because, as cybersecurity threats become more sophisticated, malicious activity performed by a compromised user or entity may not register in typical cybersecurity solutions. At its core, a user behavior anomaly detection solution is responsible for detecting anomalies and highlighting suspicious user and entity behavior.
The Need for User Behavior Analytics
To understand the need for user behavior analytics, we need to understand modern cybersecurity threats. When people think of modern threats, they typically think of well-known categories of malware such as viruses, ransomware, and command-and-control software. While these threats absolutely need to be monitored in any organization, they tend to have well-known signatures, where a "signature" is a unique identifier associated with a particular piece of malware. Because these signatures are well known, typical cybersecurity solutions have little trouble detecting them, which is good news for cybersecurity teams as a whole.
Unfortunately, the cybersecurity attacks that succeed tend to involve activity that, while malicious, doesn't register with typical cybersecurity solutions. For example, consider social engineering, the largest attack vector for any organization. Social engineering is the practice of manipulating people into revealing sensitive information about an organization, or into doing something malicious, without realizing it. What's interesting about these attacks is that, when successful, they don't appear as malicious activity to existing security tools. If an attacker convinces an employee to hand over a sensitive password and then uses that password to log in to a sensitive system, nothing about that activity looks wrong from a security software perspective. Similarly, if an employee account is compromised and the attacker uses it to exfiltrate sensitive data from the organization, that activity, while obviously harmful, also won't register with typical cybersecurity tools.
Another example illustrating the need for user behavior analytics involves monitoring systems that don't allow security tools to be installed on them. Networks contain many such systems: end-of-life systems, network appliances, and mission-critical systems that can't risk security software interfering with their daily operations. Whatever the reason security software can't be installed, cybersecurity teams still need to monitor these systems for threats. User behavior analytics solutions shine in these cases because, regardless of the system, they all produce logs. If normal behavior for these systems can be established from those logs, a user behavior anomaly detection solution can learn it and start detecting suspicious behavior. This acts as a non-invasive way to bring security to systems that can't otherwise implement it.
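As a concrete illustration of the log-based approach, here is a minimal sketch that turns syslog-style SSH authentication logs into simple per-host behavioral features (successful logins per hour of day). The log format, regular expression, and timestamp positions are assumptions made purely for illustration; real environments would need parsing rules matched to their own log sources, and the resulting counts could feed an anomaly detector like the one sketched in the next section.

```python
# Minimal sketch: turning syslog-style SSH logs into per-host behavioral
# features. The log format and field positions are illustrative assumptions;
# real log sources need their own parsing rules.
import re
from collections import defaultdict

# Matches lines like: "Jan 12 03:14:15 web01 sshd[812]: Accepted password for alice ..."
LOG_PATTERN = re.compile(r"\s(?P<host>\S+)\s+sshd\[\d+\]: Accepted \S+ for (?P<user>\S+)")

def hourly_login_counts(log_lines):
    """Count successful logins per (host, user, hour-of-day) bucket."""
    counts = defaultdict(int)
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match:
            hour = int(line[7:9])  # assumes the standard "MMM DD HH:MM:SS" syslog timestamp
            counts[(match.group("host"), match.group("user"), hour)] += 1
    return counts

sample = ["Jan 12 03:14:15 web01 sshd[812]: Accepted password for alice from 10.0.0.5 port 52211 ssh2"]
print(dict(hourly_login_counts(sample)))  # {('web01', 'alice', 3): 1}
```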
Finally, learning user and entity behavior for your particularly sensitive users or systems can act as an extra layer of cybersecurity protection. More often than not, organizations have specific users or systems that are especially important: a sensitive database, a C-suite executive, or something similar that, if compromised, would wreak havoc on the organization. Knowing when these users or systems act outside their normal behavior is particularly useful for any cybersecurity team.
How User Behavior Analytics Solutions Work
User Behavior Analytics solutions (which we'll refer to as UBA or UEBA interchangeably from here on out) detect suspicious behavior by using artificial intelligence, specifically unsupervised learning, to establish normal behavior for users and systems and then alerting when behavior deviates from that established normal. Depending on the implementation, the UBA system detects anomalies in user or entity behavior and attempts to map them to threats for threat hunters to validate and investigate.
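As a rough illustration of the unsupervised approach (not any particular vendor's implementation), here is a minimal sketch using scikit-learn's IsolationForest. The feature choices (login hour, megabytes sent, distinct hosts contacted) and the synthetic data are assumptions made purely for the example.

```python
# Minimal sketch of unsupervised anomaly detection with scikit-learn's
# IsolationForest. The feature columns and synthetic data are illustrative
# assumptions, not a prescribed schema.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Stand-in for historical per-user activity: [login_hour, mb_sent, hosts_contacted]
normal_activity = np.column_stack([
    rng.normal(9, 1, 500),    # logins cluster around 9 AM
    rng.normal(50, 10, 500),  # roughly 50 MB sent per day
    rng.normal(5, 1, 500),    # roughly 5 internal hosts contacted
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_activity)  # learn what "normal" looks like, no labels needed

# A 3 AM login that moves 900 MB to 40 hosts should stand out
suspicious = np.array([[3, 900, 40]])
print(model.predict(suspicious))            # -1 means anomaly, 1 means normal
print(model.decision_function(suspicious))  # lower scores are more anomalous
```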
The use of artificial intelligence in UEBA solutions is critical, so your mileage with UBA solutions will vary depending on the quality of the artificial intelligence being used. Unfortunately, artificial intelligence can be defined as broadly as "any system that appears to make intelligent decisions". Because the definition is so broad, it's easy to slap an AI label onto a UEBA solution that barely uses it at all. Cybersecurity teams owe it to themselves to know the difference between tools that use actual artificial intelligence (e.g. statistical analysis, neural networks, etc.) and tools that only claim to for marketing purposes. Before buying any UBA solution, here are some questions you can ask the vendor:
Can you elaborate how this UEBA tool utilizes artificial intelligence?
How does the tool perform user behavior monitoring?
Can you describe how the models are trained?
Do the models use basic statistical algorithms or deep learning algorithms? (Sketches of both approaches follow below.)
How do we feed the UEBA solution data?
How does the UBA product handle suspicious behavior indicators?
Do the models get better as they’re used over time?
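To make the statistical-versus-deep-learning question above concrete, here is a minimal sketch of the simplest end of that spectrum: a z-score check of one user's daily upload volume against their own history. The numbers are invented for illustration; a real baseline would cover many more behaviors and users.

```python
# Minimal sketch of a "basic statistical" baseline: flag a value that sits far
# outside a user's own history. The figures below are made up for illustration.
from statistics import mean, stdev

history_mb = [48, 52, 45, 60, 50, 47, 55, 49, 51, 53]  # past daily uploads (MB)
today_mb = 900

mu, sigma = mean(history_mb), stdev(history_mb)
z_score = (today_mb - mu) / sigma

# Flag anything more than 3 standard deviations above the learned normal
if z_score > 3:
    print(f"Anomalous upload volume: z = {z_score:.1f}")
```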
The usefulness of a UEBA solution is completely dependent on the data given to it, the amount of that data, and the learning algorithms it uses. QFunction recommends that larger organizations use UEBA solutions built on deep learning models, as they tend to perform better on large amounts of data.
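At the other end of the spectrum, the following sketch shows one common deep learning approach: an autoencoder trained only on normal activity, which flags events it reconstructs poorly. It uses Keras with synthetic data; the layer sizes, feature count, and 99th-percentile threshold are all assumptions for illustration, not a production recipe.

```python
# Minimal sketch of a deep learning approach: an autoencoder (Keras) trained
# only on normal activity, flagging events with high reconstruction error.
# Synthetic data and all hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(5000, 8)).astype("float32")  # 8 behavioral features

autoencoder = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    keras.layers.Dense(4, activation="relu"),    # compress to a bottleneck
    keras.layers.Dense(8, activation="linear"),  # reconstruct the input
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(normal, normal, epochs=10, batch_size=64, verbose=0)

# Reconstruction error on normal data sets the threshold for "anomalous"
reconstruction = autoencoder.predict(normal, verbose=0)
errors = np.mean((normal - reconstruction) ** 2, axis=1)
threshold = np.percentile(errors, 99)

# An event far outside the training distribution should reconstruct poorly
odd_event = np.full((1, 8), 6.0, dtype="float32")
odd_error = np.mean((odd_event - autoencoder.predict(odd_event, verbose=0)) ** 2)
print(odd_error > threshold)  # expected: True, i.e. flag for investigation
```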
About Our User Behavior Anomaly Detection Services
QFunction performs UEBA differently. Because of our experience with typical UBA solutions, we know where they tend to excel and where they tend to falter. Our solution differs from typical UEBA solutions in the following ways:
QFunction builds custom user behavior anomaly detection solutions directly into your existing environment
QFunction’s solutions learn the normal behavior of specific users or systems in your environment
Instead of classifying our solution as traditional UBA, we classify it as user behavior anomaly detection. While there is definite benefit in bringing an entirely new UEBA product into your organization, many of these products excel only in specific areas because you have to conform your data to their solution. This pigeonholes existing UEBA solutions into detecting only the anomalies and threats they already know, and limits them from learning anomalies in the parts of your data that may matter most to your organization, such as Linux systems, networking devices, or specific slices of your user data. Anomaly detection is much broader than most people realize, and we use that breadth to better meet the needs of our clients. And because we build the system ourselves, we can perform user behavior monitoring that surfaces suspicious behavior indicators in whatever format fits your environment.
We focus on specific users and entities in your organization, instead of analyzing user behavior across the entire organization, because we've noticed that most teams get overwhelmed by the data traditional UEBA solutions produce. While it's nice to have a system that can supposedly analyze all of your users and systems, the volume of anomalies and data this generates is overwhelming for most cybersecurity teams, and you end up with yet another tool in your environment that produces data you don't have time to investigate. QFunction believes in focusing on what's important in your organization and determining how to spot suspicious behavior in those specific users and systems.
We realize that QFunction's user behavior anomaly detection offering is unusual, but given the unique use cases of each organization, we believe that putting artificial intelligence and anomaly detection in the hands of organizations large and small is important for cybersecurity teams moving forward.
Conclusion
As artificial intelligence continues to mature, user behavior anomaly detection solutions will continue to evolve. If you're interested in implementing this in your environment or want more information, check out our user behavior analytics solutions! And if you're interested in how anomaly detection can be performed, check out our post on GANs and autoencoders!