Monitoring for account takeover with the Splunk platform
You work for a financial services company with a large customer base. Recently, you've noticed an increase in account takeover attempts, where unauthorized individuals gain access to customer accounts to conduct fraudulent transactions. These takeovers can happen through phishing attacks, credential stuffing, or exploiting weak passwords.
To protect your customers and the integrity of your services, you need to identify unusual login patterns, changes in user behavior, and other indicators of compromised accounts in real time.
Account takeover (ATO) occurs when a malicious actor gains unauthorized access to a user's account, often leading to fraud or theft. Common indicators include unusual login behavior, multiple failed login attempts, rapid credential changes, and anomalous transaction patterns. Fraud teams employ tools like multi-factor authentication (MFA) and anomaly detection to spot suspicious activities. Detecting ATO can be a significant challenge for fraud teams, as threat indicators and attacker tactics constantly evolve.
This article shows you how to use searches in the Splunk platform to create basic detection methods. For advanced techniques leveraging user behavioral analytics to stay ahead of emerging threats, see Monitoring for account takeover with the Splunk App for Behavioral Analytics. If you're a Splunk Enterprise Security user, you might also want to see Monitoring for account takeover with the Splunk App for Fraud Analytics.
- Some commands, parameters, and field names in the searches below might need to be adjusted to match your environment.
- Splunk recommends that customers look into using data models, report acceleration, or summary indexing when searching across hundreds of GBs of events in a single search. The searches provided here are a good starting point, but depending on your data, search time range, and other factors, more can be done to ensure that they scale appropriately.
Data required
- Application data for consumer financial applications
- Authentication data for consumer authentication applications
How to use the Splunk platform for this use case
There are many searches you can run with Splunk software to detect account takeovers. The following searches help you detect these attempts:
- ► Unusual login behavior
This search is designed to help you proactively identify potential security threats by detecting logins from unusual locations or devices in consumer financial applications. By analyzing authentication data across your systems, it highlights anomalies that could indicate unauthorized access attempts, such as logins from unfamiliar geographic areas or unrecognized devices. This early detection can help prevent account takeovers and safeguard sensitive financial information.
Procedure
Ensure that the correct source types are available for identifying logins. The table below outlines recommended source types, associated fields, and typical data sources for monitoring user authentication across financial applications.
Suggested sourcetype: auth_logs
Fields: user, action (login_success, login_failure), ip_address, location, device_info, _time, session_id
Data sources:
- Authentication servers (LDAP, AD, OAuth providers)
- Single Sign-On (SSO) systems (Okta, DUO, Azure AD)
- VPN logs
Run the following search.
sourcetype=<consumer financial app auth logs> earliest=-30d@d latest=-0d@d | stats values(location) AS known_locations values(device) AS known_devices BY user | rename known_locations AS historical_locations known_devices AS historical_devices | join type=left user [ search sourcetype=<consumer financial app auth logs> earliest=@d | fields user location device _time ip ] | where NOT (mvfind(historical_locations, location) >= 0 AND mvfind(historical_devices, device) >= 0) | table user, location, device, _time, ip
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
sourcetype=<consumer financial app auth logs>
Search your consumer financial app auth logs data.
earliest=-30d@d latest=-0d@d
Sets the time range from 30 days ago up to the start of today.
| stats values(location) AS known_locations values(device) AS known_devices BY user
Aggregates unique values for location and device per user.
| rename known_locations AS historical_locations known_devices AS historical_devices
Rename fields for clarity in subsequent steps.
| join type=left user
[
search sourcetype=<consumer financial app auth logs> earliest=@d
| fields user location device _time ip
]
Merge historical data with today's logins based on user.
| where NOT (mvfind(historical_locations, location) >= 0 AND mvfind(historical_devices, device) >= 0)
Filter for logins where either the current location or the current device does not appear in the user's historical values. The mvfind function returns the index of the first matching value in the multivalue field, or NULL if no match is found.
| table user, location, device, _time, ip
Display the specified fields in a tabular format.
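Outside of Splunk, the core of this detection is a set-membership check of each login against a per-user historical baseline. The following Python sketch illustrates the same logic as the search above; all field names and sample values are illustrative, not part of any real data source.

```python
# Sketch of the baseline comparison: a login is flagged when its location
# OR device is absent from the user's 30-day history. Field names mirror
# the search above and are illustrative.

def build_baseline(events):
    """Aggregate historical locations/devices per user (like stats values(...) BY user)."""
    baseline = {}
    for e in events:
        entry = baseline.setdefault(e["user"], {"locations": set(), "devices": set()})
        entry["locations"].add(e["location"])
        entry["devices"].add(e["device"])
    return baseline

def flag_unusual_logins(baseline, todays_logins):
    """Return logins whose location or device is new for that user."""
    flagged = []
    for login in todays_logins:
        hist = baseline.get(login["user"], {"locations": set(), "devices": set()})
        if login["location"] not in hist["locations"] or login["device"] not in hist["devices"]:
            flagged.append(login)
    return flagged

history = [
    {"user": "alice", "location": "NY", "device": "iPhone"},
    {"user": "alice", "location": "NJ", "device": "iPhone"},
]
today = [
    {"user": "alice", "location": "NY", "device": "iPhone"},   # known: not flagged
    {"user": "alice", "location": "RO", "device": "Android"},  # new location and device: flagged
]
print(flag_unusual_logins(build_baseline(history), today))
```

Note that, like the SPL search, this flags a login when either attribute is new; tighten it to `and` if you only want logins where both attributes are unfamiliar.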
- ► Rapid credential changes
This search is designed to help you proactively identify patterns of rapid credential changes within consumer financial applications. Rapid changes might signal unauthorized access attempts or account takeover, so early detection can help prevent account compromise and safeguard sensitive financial information.
Procedure
Ensure that the correct source types are available for identifying credential changes. The table below outlines recommended source types, associated fields, and typical data sources for monitoring credential changes across financial applications.
Suggested sourcetype: credential_changes
Fields: user, action (password_reset, profile_update, mfa_enabled), _time, initiated_by, ip_address, device_info
Data sources:
- Identity and Access Management (IAM) systems (Okta, Azure AD)
- User management portals (SNOW)
- Security tools (MFA tools)
- Run the following search.
sourcetype=<consumer financial app auth logs> action IN ("password_reset", "profile_update") | where _time >= relative_time(now(), "-24h@h") | stats count AS change_count earliest(_time) AS first_change latest(_time) AS last_change BY user | where change_count > 3 | table user, change_count, first_change, last_change
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
sourcetype=<consumer financial app auth logs>
Search your consumer financial app auth logs data.
action IN ("password_reset", "profile_update")
Filter events to only include credential change actions.
| where _time >= relative_time(now(), "-24h@h")
Ensure only events from the last 24 hours are considered.
| stats count AS change_count
Count the number of credential changes per user.
earliest(_time) AS first_change
Capture the timestamp of the first credential change.
latest(_time) AS last_change
Capture the timestamp of the most recent credential change.
BY user
Group the statistics by each user.
| where change_count > 3
Apply the threshold for rapid changes.
| table user, change_count, first_change, last_change
Select and format the specified fields for output.
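The logic of this detection is a windowed count with a threshold. The following Python sketch mirrors it; user names, timestamps, and the threshold of 3 are illustrative, matching the search above.

```python
# Sketch of the rapid-credential-change detection: count password_reset /
# profile_update events per user over the last 24 hours and flag users
# exceeding a threshold. Sample data is illustrative.
from datetime import datetime, timedelta

def rapid_changes(events, now, threshold=3, window=timedelta(hours=24)):
    """Return {user: count} for users with more than `threshold` changes in `window`."""
    counts = {}
    for e in events:
        if e["action"] in ("password_reset", "profile_update") and now - e["time"] <= window:
            counts[e["user"]] = counts.get(e["user"], 0) + 1
    return {u: c for u, c in counts.items() if c > threshold}

now = datetime(2024, 1, 2, 12, 0)
events = [
    {"user": "bob", "action": "password_reset", "time": now - timedelta(hours=h)}
    for h in (1, 2, 3, 4)                       # four changes inside the window
] + [
    {"user": "bob", "action": "password_reset", "time": now - timedelta(days=3)},  # outside window
    {"user": "carol", "action": "profile_update", "time": now - timedelta(hours=1)},
]
print(rapid_changes(events, now))  # → {'bob': 4}
```

As in the SPL search, the threshold is a tuning point: start high to limit noise, then lower it as you learn your users' normal behavior.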
- ► Multiple failed login attempts
This search is designed to help you monitor failed login attempts, helping you to identify security risks such as brute-force attacks or compromised accounts. By identifying users or IP addresses with an unusual number of failed login attempts, you can quickly address potential threats and take preventative actions.
Procedure
Ensure that the correct source types are available for identifying logins. The table below outlines recommended source types, associated fields, and typical data sources for monitoring user authentication across financial applications.
Suggested sourcetype: auth_logs
Fields: user, action (login_failure), ip_address, location, device_info, _time, error_code, error_message
Data sources:
- Authentication servers (LDAP, AD, OAuth providers)
- SSO systems (Okta, DUO, Azure AD)
- VPN logs (failed connection attempts)
- Application servers (login failure events)
Run the following search.
sourcetype=<consumer financial app auth logs> action="login_failure" | bin _time span=15m | stats count AS failed_attempts BY user, ip, _time | where failed_attempts > 5 | sort -failed_attempts | table _time, user, ip, failed_attempts
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
sourcetype=<consumer financial app auth logs>
Search your consumer financial app auth logs data.
action="login_failure"
Filter events to include only failed login attempts.
| bin _time span=15m
Group events into 15-minute intervals to aggregate failed attempts within each window.
| stats count AS failed_attempts BY user, ip, _time
Count the number of failed login attempts and group the count by user, ip, and the binned _time.
| where failed_attempts > 5
Apply the threshold to filter relevant events.
| sort -failed_attempts
Sort the results with the highest number of failed attempts first.
| table _time, user, ip, failed_attempts
Display the specified fields in a tabular format.
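The `bin _time span=15m` step is equivalent to snapping each event timestamp down to a 15-minute boundary before counting. The following Python sketch shows the same bucketing and thresholding; the sample users, IPs, and epoch times are illustrative.

```python
# Sketch of the failed-login detection: bucket failures into 15-minute
# windows per (user, ip) and flag buckets over a threshold, mirroring
# `bin _time span=15m | stats count BY user, ip, _time`.
def failed_login_buckets(events, span_seconds=900, threshold=5):
    """events: list of (epoch_time, user, ip) tuples for failed logins."""
    counts = {}
    for t, user, ip in events:
        bucket = t - (t % span_seconds)          # snap to the 15-minute boundary
        key = (user, ip, bucket)
        counts[key] = counts.get(key, 0) + 1
    return {k: c for k, c in counts.items() if c > threshold}

# Six failures from one IP within one window, plus one stray failure elsewhere.
events = [(1_700_000_000 + i * 10, "dave", "203.0.113.9") for i in range(6)]
events.append((1_700_000_000, "erin", "198.51.100.7"))
print(failed_login_buckets(events))
```

One caveat that applies to the SPL search as well: fixed windows can split a burst of failures across two adjacent buckets, so consider overlapping or sliding windows if attackers appear to evade the threshold.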
- ► Unusual transaction patterns
Monitoring for abnormal transaction activity is critical in detecting potential fraud or account compromise, especially following user logins. This search allows you to identify suspicious transactions by comparing recent financial activities against a user's historical transaction behavior. To do this, you'll follow a number of steps:
- Ensure that the correct source types are available for identifying unusual transaction patterns.
- Define variables for time frames.
- Aggregate historical transaction data (last 30 days, excluding yesterday).
- Retrieve recent transactions (last 24 hours).
- Calculate deviation metrics.
- Identify unusual transactions based on defined criteria.
By analyzing factors such as transaction amount, payees, and locations, and calculating deviations from established patterns, you can quickly highlight irregularities.
Procedure
Ensure that the correct source types are available for identifying unusual transaction patterns. The table below outlines recommended source types, associated fields, and typical data sources for monitoring transaction patterns across financial applications.
Suggested sourcetype: transaction_logs
Fields: user, transaction_id, transaction_amount, transaction_type, payee, location, device_info, _time, ip_address, status (completed, pending, failed)
Data sources:
- Payment gateways (Stripe, PayPal, Clover)
- Banking systems (core banking platforms)
- E-commerce platforms (Shopify, Magento)
- Point of Sale (POS) systems (retail transaction systems)
- Internal financial systems (Enterprise Resource Planning (ERP) systems)
- Run the following search to define variables for time frames.
| eval historical_start = relative_time(now(), "-30d@d") | eval historical_end = relative_time(now(), "-1d@d") | eval recent_start = relative_time(now(), "-24h@h")
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| eval historical_start = relative_time(now(), "-30d@d")
Mark the start of the historical data (30 days ago).
| eval historical_end = relative_time(now(), "-1d@d")
Mark the end of the historical data (1 day ago).
| eval recent_start = relative_time(now(), "-24h@h")
Mark the start of recent transactions (last 24 hours).
- Run the following search to aggregate historical transaction data for the last 30 days, excluding yesterday.
| search sourcetype=<consumer financial app transaction logs> | where _time >= historical_start AND _time <= historical_end | stats avg(transaction_amount) AS avg_amount, stdev(transaction_amount) AS stdev_amount, count AS total_transactions, dc(payee) AS unique_payees, values(payee) AS known_payees, values(location) AS known_locations BY user
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| search sourcetype=<consumer financial app transaction logs> | where _time >= historical_start AND _time <= historical_end
Compute baseline metrics for each user over the past 30 days.
| stats avg(transaction_amount) AS avg_amount
Calculate the average transaction amount per user.
stdev(transaction_amount) AS stdev_amount
Determine the standard deviation to understand variability.
count AS total_transactions
Count total transactions per user.
dc(payee) AS unique_payees
Count distinct payees per user.
values(payee) AS known_payees
List all unique payees per user.
values(location) AS known_locations
List all unique locations per user.
BY user
Group statistics by each user.
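The baseline aggregation above computes, per user, a mean and standard deviation of transaction amounts plus sets of known payees and locations. The following Python sketch shows equivalent logic; the user names, amounts, and payees are illustrative sample data.

```python
# Sketch of the historical baseline aggregation: per-user mean, standard
# deviation, and payee/location sets, mirroring the stats command above.
from statistics import mean, pstdev

def build_tx_baseline(transactions):
    """Aggregate 30-day transaction history into per-user baseline metrics."""
    per_user = {}
    for t in transactions:
        per_user.setdefault(t["user"], []).append(t)
    baseline = {}
    for user, txs in per_user.items():
        amounts = [t["amount"] for t in txs]
        baseline[user] = {
            "avg_amount": mean(amounts),
            "stdev_amount": pstdev(amounts),  # population stdev; Splunk's stdev is the sample stdev
            "total_transactions": len(txs),
            "known_payees": {t["payee"] for t in txs},
            "known_locations": {t["location"] for t in txs},
        }
    return baseline

txs = [
    {"user": "alice", "amount": 90, "payee": "grocer", "location": "NY"},
    {"user": "alice", "amount": 110, "payee": "utility", "location": "NY"},
]
print(build_tx_baseline(txs)["alice"]["avg_amount"])  # → 100
```

Note the population-versus-sample standard deviation difference flagged in the comment; with a 30-day history the two are close, but use the sample form if you want to match Splunk's stdev exactly.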
- Run the following search to retrieve recent transactions over the past 24 hours.
| join type=left user [ search sourcetype=<consumer financial app transaction logs> | where _time >= recent_start | fields user transaction_amount transaction_type payee location device _time ip ]
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| join type=left user
Perform a left join on the user field to combine historical and recent data.
search sourcetype=<consumer financial app transaction logs> | where _time >= recent_start
Fetch all transactions from the last 24 hours and merge them with historical data.
| fields user transaction_amount transaction_type payee location device _time ip
Select transaction amount, transaction type, payee, location, device, time, and IP.
- Run the following search to calculate deviation metrics.
| eval amount_deviation = (transaction_amount - avg_amount) / stdev_amount | eval is_high_amount = if(transaction_amount > (avg_amount + 3 * stdev_amount), 1, 0) | eval is_low_amount = if(transaction_amount < (avg_amount - 3 * stdev_amount), 1, 0) | eval is_new_payee = if(isnotnull(mvfind(known_payees, payee)), 0, 1) | eval is_new_location = if(isnotnull(mvfind(known_locations, location)), 0, 1) | eval transaction_count = total_transactions
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| eval amount_deviation = (transaction_amount - avg_amount) / stdev_amount
Normalize the transaction amount based on historical mean and standard deviation.
| eval is_high_amount = if(transaction_amount > (avg_amount + 3 * stdev_amount), 1, 0)
Flag transactions significantly higher than the average (e.g. >3σ).
| eval is_low_amount = if(transaction_amount < (avg_amount - 3 * stdev_amount), 1, 0)
Flag transactions significantly lower than the average (e.g. <-3σ).
| eval is_new_payee = if(isnotnull(mvfind(known_payees, payee)), 0, 1)
Flag transactions to new payees not seen in historical data.
| eval is_new_location = if(isnotnull(mvfind(known_locations, location)), 0, 1)
Flag transactions from new locations not seen in historical data.
| eval transaction_count = total_transactions
Retain the total number of historical transactions for potential further analysis.
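The deviation metric is a standard z-score, and the high/low flags are a 3-sigma test against the historical mean. The following Python sketch reproduces those eval statements; the baseline of average $100 and standard deviation $20 is illustrative.

```python
# Sketch of the deviation metrics: z-score of the transaction amount plus
# 3-sigma high/low flags, mirroring the eval statements above.
def deviation_flags(amount, avg_amount, stdev_amount):
    """Return the z-score and 3-sigma outlier flags for one transaction."""
    z = (amount - avg_amount) / stdev_amount
    return {
        "amount_deviation": z,
        "is_high_amount": 1 if amount > avg_amount + 3 * stdev_amount else 0,
        "is_low_amount": 1 if amount < avg_amount - 3 * stdev_amount else 0,
    }

# Baseline: average $100, standard deviation $20, so the high threshold is $160.
print(deviation_flags(500, 100, 20))   # z = 20.0, flagged high
print(deviation_flags(105, 100, 20))   # z = 0.25, not flagged
```

Guard against `stdev_amount` being zero (a user whose historical amounts are all identical) before dividing; the SPL version has the same edge case.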
- Run the following search to identify unusual transactions based on defined criteria.
| where is_high_amount=1 OR is_low_amount=1 OR is_new_payee=1 OR is_new_location=1
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| where is_high_amount=1 OR is_low_amount=1 OR is_new_payee=1 OR is_new_location=1
Apply the filter to retain only unusual transactions.
- Run the following search to display relevant fields.
| table user, transaction_amount, avg_amount, stdev_amount, amount_deviation, is_high_amount, is_low_amount, payee, is_new_payee, location, is_new_location, _time, ip, device | sort -transaction_amount
Search explanation
The table provides an explanation of what each part of this search achieves. You can adjust this query based on the specifics of your environment.
Splunk Search | Explanation
| table user, transaction_amount, avg_amount, stdev_amount, amount_deviation, is_high_amount, is_low_amount, payee, is_new_payee, location, is_new_location, _time, ip, device
Select and format the specified fields.
| sort -transaction_amount
Order the results by transaction amount in descending order, highlighting the largest anomalies first.
Next steps
Use the results of these searches to make recommendations to the rest of the security team about which accounts should be investigated for potential account takeover. Create reports based on these searches and schedule them to run at a regular cadence as needed. Be sure to follow any industry policies and regulations that are required for compliance.
For advanced techniques leveraging user behavioral analytics to stay ahead of emerging threats, see Monitoring for account takeover with the Splunk App for Behavioral Analytics.
To further advance your use cases, the Splunk Essentials for the Financial Services Industry app helps you automate the searches to detect financial crime. The app also provides more insight on how searches can be applied in your environment, how they work, the difficulty level, and what data can be valuable to run them successfully.
The Splunk App for Fraud Analytics provides Splunk Enterprise Security users a number of other fraud detection solutions for financial services such as account takeover and new account abuse.
If you have questions about monitoring for account takeover in your environment, reach out to your Splunk account team or representative for comprehensive advice and assistance. You can contact your account team through the Contact Us page. For more in-depth support, consult Splunk On-Demand Services to access credit-based services that provide direct access to Splunk technical consultants for a variety of technical services from a pre-defined catalog. Most customers have OnDemand Services per their license support plan. Engage the ODS team at ondemand@splunk.com if you would like assistance.
In addition, these resources might help you understand and implement this guidance:
- Use Case Explorer: Risk-based alerting
- Use case: Monitoring consumer bank accounts to maintain compliance
- Use case: Detecting credit card fraud
- Use case: Detecting wire transfer fraud
- Use case: Investigating interesting behavior patterns with risk-based alerting
- Use case: Monitoring new logins to financial applications
- Use case: Using modern methods of detecting financial crime
- Use case: Detecting multiple account login denials followed by authorization