
Reducing Cisco ASA data volumes with Edge Processor and Ingest Processor

 

Whether you’re filtering out low-priority firewall logs or reducing noisy events, Splunk Edge Processor (customer-hosted) and Splunk Ingest Processor (Splunk-hosted SaaS) let you filter, transform, and optimize Cisco ASA logs before routing them to the Splunk platform or to Amazon S3 for low-cost storage.

With just a few clicks, you can apply pre-built templates to:

  • Reduce unnecessary log ingestion and optimize license usage
  • Improve search performance by focusing only on high-value events
  • Route processed data directly to the Splunk platform for real-time analysis, or to Amazon S3 for low-cost, long-term storage (data in S3 can later be accessed using Federated Search for Amazon S3 (FS-S3) if needed)

You can get started quickly with out-of-the-box templates and preview the results before applying any changes, with no custom code required.

The process shown in this article applies to both Edge Processor and Ingest Processor, but if you're just getting started, we recommend working with Ingest Processor. For the purposes of this article, we'll focus on Ingest Processor.

Data required

Cisco ASA

Prerequisites

  • Verify that you have access to Splunk Data Management and can log in to your tenant.
  • Connect your tenant to a Splunk Cloud Platform deployment. If you are the first user on your tenant, you will need to complete the first-time setup instructions for Ingest Processor or Edge Processor to connect your tenant. This connection will provide indexes for storing the logs and metrics passing through the processors.

How to use Splunk software for this use case

Watch this video or follow the written steps below to use a pre-built pipeline template to filter out low-priority Cisco ASA logs.

Step 1: Access the pipeline templates

  1. In your Data Management or Splunk Cloud Services (SCS) environment, navigate to Data Management > Pipelines.
  2. Click the Templates tab.
  3. In the list of available templates, look for "Cisco ASA log reduction". Use the search bar at the top of the template list if you can't see it initially.


Step 2: Create a new pipeline using the template

  1. Click New Pipeline.
  2. Choose the Ingest Processor type.
  3. From the list of templates, select Cisco ASA log reduction, then click Next.


Step 3: Define your data source

Your Cisco ASA source type will be auto-detected (typically cisco:asa, or whatever custom source type you have defined).

Here you can choose how to preview your data. You can use sample logs provided in the Splunk platform, or create a live snapshot of your incoming Cisco ASA data stream (recommended for real-time testing).
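
If you later want to adjust the source scoping yourself, the pipeline's partition is expressed as a where clause on event fields such as sourcetype. A minimal sketch, assuming the default cisco:asa source type (adjust to match your own):

$pipeline = | from $source
| where sourcetype == "cisco:asa"
| into $splunk_destination;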

Step 4: (Optional) Skip metric destination

If you’re not routing data to a metrics index, skip this step.
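
If you do want metrics, the template's metrics branch (shown in full in Step 10) converts each event into a counter metric dimensioned by message ID before sending it to the metrics destination:

| thru [
   | logs_to_metrics name="cisco_asa" metrictype="counter" value=1 time=_time dimensions={"message_id": message_id}
   | into $metrics_destination
]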

Step 5: Choose a destination index

Select the index where you want to store your filtered Cisco ASA logs. You can use an existing index or create a new one beforehand.
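
Note that the template can also override the index per event inside the pipeline. For example, the authentication-log branch in Step 10 sets it with:

| eval index = "cisco_auth_logs"
| into $splunk_destination

If you create a new index, make sure it exists on your connected Splunk Cloud Platform deployment before you apply the pipeline.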

Step 6: Preview and understand the pipeline logic

Scroll down to the SPL2 Pipeline Editor to view the logic and customize it, if necessary. You can find the full SPL2 in Step 10 of this process.

This pipeline does the following (a worked example follows this list):

  • Extracts message IDs from Cisco ASA events
  • Drops low-priority logs (based on predefined message IDs)
  • Forwards high-value logs to your selected index
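
For example, given a raw event like this (a hypothetical sample, shown only to illustrate the extraction):

%ASA-6-302013: Built outbound TCP connection 366 for outside:10.0.0.5/443 to inside:192.168.1.10/51432

the rex pattern (%ASA|%FTD)-\d+-(?P<message_id>\d+) captures message_id="302013", which is on the predefined drop list, so the event is filtered out before indexing.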

Step 7: Test your pipeline

Click the Preview button to test how the pipeline filters your log data. You’ll see a side-by-side view of raw and processed data that you can compare.

Step 8: Save and apply

  1. Click Save, and give your pipeline a meaningful name (example: cisco_asa_filter_splunk).
  2. When prompted, click Yes to apply the pipeline and begin processing data.


Step 9: Validate the results

  1. Go to Search & Reporting in the Splunk platform.
  2. Run a search on your destination index (example: index=cisco_asa_index).

You’ll notice significantly fewer logs — only the most relevant Cisco ASA messages are being indexed.
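
To quantify the reduction, compare event volumes from before and after the pipeline was applied. A minimal sketch, assuming your destination index is cisco_asa_index:

index=cisco_asa_index | timechart span=1h count

Hourly counts should drop noticeably once the pipeline starts filtering low-priority message IDs.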


Step 10: (Optional) Customize the template

If you need to make any changes, you can use this SPL2 code to modify the template:

import 'cisco_msg_id.csv' from /envs.splunk.'eps-shw-522513dc5758f0'.lookups
import route from /splunk/ingest/commands
import logs_to_metrics from /splunk/ingest/commands

function extract_useful_fields($source) {
    return | from $source
    /* Keep only the message portion, starting at %ASA or %FTD */
    | rex field=_raw /(?P<_raw>(%ASA|%FTD).*)/
    /* Extract the message ID number */
    | rex field=_raw /(%ASA|%FTD)-\d+-(?P<message_id>\d+)/
    /* Extract the username */
    | rex field=_raw /^[^'\n]*'(?P<username>[^']+)/
}

function drop_security_noise($source) {
    return | from $source
    | where message_id != "302013"
    | where message_id != "302015"
    | where message_id != "302016"
    | where message_id != "110003"
    | where message_id != "110002"
}

function mask_usernames($source) {
    return | from $source
    | eval _raw=replace(_raw, username, "[NAME_REDACTED]")
}

function enrich_with_explanation($source) {
    return | from $source
    | lookup 'cisco_msg_id.csv' message_id AS message_id OUTPUT explanation AS explanation
}

$pipeline = | from $source
    // Extract the useful fields (message ID, username)
    | extract_useful_fields
    // Drop low-priority message IDs (302013, 302015, 302016, 110002, 110003)
    | drop_security_noise
    // Enrich log events with explanations based on message ID
    | enrich_with_explanation
    // Convert logs to metrics and send them to the metrics destination (for example, Splunk Observability Cloud)
    | thru [
        | logs_to_metrics name="cisco_asa" metrictype="counter" value=1 time=_time dimensions={"message_id": message_id}
        | into $metrics_destination
    ]
    // Send authentication logs (message ID 109025) to a Splunk index
    | route message_id == 109025, [
        // Mask usernames to protect PII
        | mask_usernames
        | fields -username
        | eval index = "cisco_auth_logs"
        | into $splunk_destination
    ]
    // Archive the rest of the logs to Amazon S3
    | into $aws_s3_destination;
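
If you want to filter additional events, you can extend the drop_security_noise function with more where clauses. For example, to also drop message ID 106023 (a hypothetical addition, shown only for illustration):

function drop_security_noise($source) {
    return | from $source
    | where message_id != "302013"
    | where message_id != "302015"
    | where message_id != "302016"
    | where message_id != "110003"
    | where message_id != "110002"
    // Hypothetical addition: also drop ACL deny messages (106023)
    | where message_id != "106023"
}

Use the Preview pane to confirm the additional events are filtered before saving your changes.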

Next steps

These resources might help you understand and implement this guidance: