Splunk Lantern

Using Edge Processor to filter out cardholder data for PCI DSS compliance

 

As a Splunk admin, your duties involve managing data ingestion, ensuring data security, and maintaining compliance with industry standards. Your organization uses Splunk Cloud Platform to monitor and analyze diverse data sources, including those potentially containing cardholder information. It's essential to ensure that card data, such as the primary account number (PAN), cardholder name, expiry date, and other related information, is appropriately stored in Splunk Cloud Platform.

Splunk Edge Processor (EP) offers the capability to filter, mask, and transform your data close to its source before routing the processed data to external environments. This article shows you how to use Splunk Edge Processor to ensure that sensitive cardholder data is filtered out before being ingested into Splunk Cloud Platform, allowing you to maintain PCI DSS compliance. You'll complete the following steps to do this:

  • Learn PCI DSS requirements
  • Build the pipeline
  • Test the pipeline
  • Monitor data ingestion

Whether you choose to filter, mask, or truncate data depends on the type of data being handled. You can learn more about this in the Learn PCI DSS requirements section of this article. This article deals specifically with filtering data before ingestion into Splunk Cloud Platform. To learn how to mask or truncate data before ingestion, see Using Edge Processor to mask or truncate cardholder data for PCI DSS compliance.

Data required

Credit card transaction data

Prerequisites

  • Splunk Cloud Platform version 9.x or higher
  • A Splunk Edge Processor tenant with a paired Splunk Cloud Platform stack
  • A Splunk Edge Processor instance running on a machine with an accessible URL
  • A credit card data source (Or you can use this sample input script to generate synthetic data)
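If you don't have a live credit card data source, a small script can generate synthetic transaction events for testing. The following Python sketch is a hypothetical stand-in for the sample input script mentioned above; the field names match the cc_transaction_logs source type used later in this article, and all values are fabricated:

```python
import random

def synthetic_event():
    """Generate one fake credit card transaction event for pipeline testing.

    All values are synthetic. Field names mirror the cc_transaction_logs
    source type described in the Build the pipeline section.
    """
    pan = "".join(random.choice("0123456789") for _ in range(16))
    name = random.choice(["ALICE EXAMPLE", "BOB SAMPLE"])
    return {
        "cc_number": pan,
        "cardholder_name": name,
        "expiry_data": f"{random.randint(1, 12):02d}/{random.randint(25, 30)}",
        "ccv": f"{random.randint(0, 999):03d}",
        "pin": f"{random.randint(0, 9999):04d}",
        "pin_block": "".join(random.choice("0123456789ABCDEF") for _ in range(16)),
        # Track formats shaped to match the regexes used in the pipeline below.
        "track1": f"%B{pan}^{name}^{random.randint(1000000, 9999999)}?",
        "track2": f"{pan}={random.randint(1000000, 9999999)}?",
        "transaction_type": random.choice(["purchase", "refund"]),
    }

event = synthetic_event()
```

You can write a stream of such events to a log file and point your Edge Processor instance at it.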

Learn PCI DSS requirements

PCI DSS requirement 3 states that cardholder data elements like the primary account number (PAN), cardholder name, service code, and expiration date must be protected whenever they are stored, transmitted, or processed. This is in contrast to sensitive authentication data elements like full magnetic stripe data, card verification value (CVV2), card validation code (CVC2), and card identification number (CID), as well as the PIN/PIN block. Storage of these elements is not permitted, even if encrypted.

In this article, we'll explore how to filter out sensitive authentication data elements to comply with the requirement that this data must not be stored after authorization, even if encrypted. Full requirements for these data types are:

  • Full magnetic stripe data (Track 1, Track 2)
    • PCI DSS requirement: Do not store the full contents of magnetic stripe data after authorization.
    • Implementation: Implement ingestion rules to not store the full contents of magnetic stripe data.
  • CAV2/CVC2/CVV2/CID data
    • PCI DSS requirement: Do not store the card verification code or value (CAV2/CVC2/CVV2/CID) used during card authorization.
    • Implementation: Implement ingestion rules to not store the card verification code or value used during card authorization.
  • Personal Identification Number (PIN) and PIN block
    • PCI DSS requirement: Do not store the PIN/PIN block used for ATM or POS transactions.
    • Implementation: Implement ingestion rules to not store the PIN/PIN block used for ATM or POS transactions.

After you've identified the data that needs to be filtered out, you can begin to build your pipeline.

Build the pipeline

  1. Ensure that the correct source types are available for identifying card data. The table below outlines recommended source types, associated fields, and typical data sources for credit card data across financial applications.

    | Suggested source type | Fields | Sources |
    |---|---|---|
    | cc_transaction_logs | cc_number, pan_number, cc_type, ccv, expiry_data, pin, pin_block, transaction_type, cardholder_name, track1, track2, track3 | Payment gateways (Stripe, PayPal, Clover); banking systems (core banking platforms); e-commerce platforms (Shopify, Magento); Point of Sale (POS) systems (retail transaction systems) |
  2. Create an SPL2 pipeline to filter out the data before it's stored or displayed. The pipeline below takes in credit card transaction data, uses regex matches to confirm that the incoming events contain sensitive authentication data, and drops those fields. After dropping the fields, the pipeline sends the cleaned data to an index in Splunk Cloud Platform.

    Pipeline definition (SPL2)

    function drop_non_storable($source): string
    {
        return
        | from $source
        | eval is_track1_valid = match(track1, /^%B[0-9]{0,19}\^[A-Za-z0-9_ ]{2,26}\^[0-9]{7}[A-Za-z0-9_]*\?$/)
        | eval is_track2_valid = match(track2, /^[0-9]{13,19}=[0-9]{7}\?$/)
        | eval is_pin_valid = match(pin, /^[0-9]{4}$/)
        | eval is_pin_block_valid = match(pin_block, /^[0-9A-F]{16}$/)
        | eval is_ccv_valid = match(ccv, /^[0-9]{3,4}$/)
        | where is_track1_valid and is_track2_valid and is_pin_valid and is_pin_block_valid and is_ccv_valid
        | fields - track1, track2, pin, pin_block, ccv, is_track1_valid, is_track2_valid, is_pin_valid, is_pin_block_valid, is_ccv_valid
    }

    $pipeline = | from $source
        | drop_non_storable
        | eval index = "cc_transactions_clean_index"
        | into $destination;

    This pipeline is intended for data with sourcetype=cc_transactions.

    The SPL2 above checks the following fields via regex and, where they match, drops the fields from the events:

    • track1
    • track2
    • ccv
    • pin_block
    • pin
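The match() calls in the pipeline are ordinary regular expressions, so you can sanity-check them outside Splunk before applying the pipeline. The following Python sketch exercises the same patterns against a made-up sample event (the values are illustrative only):

```python
import re

# The same patterns used by the pipeline's match() calls.
PATTERNS = {
    "track1": r"^%B[0-9]{0,19}\^[A-Za-z0-9_ ]{2,26}\^[0-9]{7}[A-Za-z0-9_]*\?$",
    "track2": r"^[0-9]{13,19}=[0-9]{7}\?$",
    "pin": r"^[0-9]{4}$",
    "pin_block": r"^[0-9A-F]{16}$",
    "ccv": r"^[0-9]{3,4}$",
}

def matching_fields(event):
    """Return the names of fields whose values match their sensitive-data pattern."""
    return {f for f, p in PATTERNS.items()
            if f in event and re.match(p, event[f])}

sample = {
    "track2": "4111111111111111=1234567?",
    "pin": "1234",
    "ccv": "999",
    "transaction_type": "purchase",
}
print(matching_fields(sample))  # track2, pin, and ccv match; track1/pin_block are absent
```

Note that the pipeline's where clause keeps only events in which all five fields match before stripping them, so adjust the patterns if your source emits events with a subset of these fields.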


  3. After you have constructed your pipeline, follow these instructions to save and apply your pipeline:
    1. Test your pipeline rule. In the top right corner of the screen, click the blue Preview button.
    2. Set the Data destination to the appropriate index, for example: cc_transactions_clean_index.
    3. To save the destination, click Apply.
    4. In the top right corner of the screen, click Save pipeline.
    5. Give your pipeline a suitable name, for example: cc_transactions_pci_non_storage.
    6. To save your pipeline, click Save.
    7. To try out the new pipeline, click Pipelines on the top left of the page.
    8. Locate the pipeline you just created, click the three dots next to your new pipeline, and select Apply/remove.
    9. Select the Splunk Edge Processor you created earlier and click Save. You will see a brief message stating that your changes are being saved.

Test the pipeline

To verify that your pipeline has been successfully applied:

  1. Log into the Splunk platform and open the Search app.
  2. Run the following search and verify that you see the events coming from this pipeline:
    index=cc_transactions_clean_index sourcetype=cc_transactions
  3. Verify that no prohibited cardholder data elements (for example, full magnetic stripe data) appear in the results. Cross-check with the pipeline rules you implemented to ensure compliance.
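To illustrate the cross-check in step 3, a short script like the following can scan exported search results for prohibited fields. The field names are assumed from the pipeline above, and the sample events are fabricated:

```python
# Fields that must never survive the pipeline, per PCI DSS requirement 3.
PROHIBITED = {"track1", "track2", "ccv", "pin", "pin_block"}

def find_violations(events):
    """Return (event_index, field_name) pairs for any prohibited field still present."""
    return [(i, f) for i, e in enumerate(events) for f in PROHIBITED if f in e]

# Hypothetical events exported from cc_transactions_clean_index.
cleaned = [
    {"cc_number": "4111111111111111", "transaction_type": "purchase"},
    {"cc_number": "5500000000000004", "pin": "1234"},  # should have been dropped
]
print(find_violations(cleaned))  # → [(1, 'pin')]
```

An empty result indicates that the pipeline stripped every prohibited field from the sampled events.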

Monitor data ingestion

To ensure your data ingestion pipelines comply with PCI DSS requirements, it is critical to perform ongoing monitoring. You should regularly review data metrics to ensure that incoming and outgoing data aligns with expected volumes and formats. For more information, see Monitor system health and activity for Edge Processor.

Next steps

To enhance your use cases and maintain compliance, the Splunk App for PCI Compliance with Splunk Enterprise or Splunk Enterprise Security can assist in addressing challenges related to PCI audits. This app, developed and supported by Splunk, is designed to help organizations meet PCI DSS 4.0 requirements. It evaluates and monitors the effectiveness and current status of PCI compliance technical controls in real-time. Additionally, it can identify and prioritize control areas that might need attention, allowing you to promptly address auditor reports or data requests.

The Splunk App for PCI Compliance provides businesses with advanced tools to meet PCI DSS requirements and automate monitoring. As cyber threats evolve, the Splunk platform helps you stay prepared while maintaining compliance.

In addition, these resources might help you understand and implement this guidance: