
Splunk Lantern

Integrating Gigamon Deep Observability Pipeline with the Splunk platform

 

The need for real-time network visibility and monitoring is paramount. Gigamon's Deep Observability Pipeline (DOP) is a powerful tool designed to provide in-depth network visibility, traffic monitoring, and data intelligence.

Integrating this data with the Splunk platform can help enhance enterprise network security posture, optimize performance, and gain actionable insights. In this article, we'll guide you through the steps required to configure the integration of network-derived intelligence data from the Gigamon Deep Observability Pipeline into both Splunk Enterprise and Splunk Cloud Platform.

Prerequisites

Before proceeding, ensure you have the following:


Steps for integration with Splunk Enterprise and Splunk Cloud Platform


Step 1: Prepare Gigamon for integration

  1. Configure the Gigamon Observability Gateway using Gigamon's GigaVUE V Series virtual visibility nodes (VSN). The feed to Application Metadata Intelligence (AMI) should come from various tapped sources.
  2. Deploy a V Series node (for the Application Metadata Exporter) with the traffic acquisition method set to Customer Orchestrated Source.
  3. Create a monitoring session (Rep1 (In) > AMX > Rep 2 (Out)). See the Gigamon documentation for a pictorial flow of the solution and more details.

  4. Enable the HTTP Event Collector (HEC) on the Splunk platform.

    As an alternative to using the HEC to send Gigamon data to the Splunk platform, you can use Kafka as a message bus between Gigamon and the Splunk platform. For more details, refer to the Gigamon and Splunk documentation.

    For the Splunk platform to receive HTTP-based data from Gigamon DOP, you need to enable the HTTP Event Collector (HEC) on your Splunk instance.

    1. Navigate to Settings > Data Inputs in the Splunk platform and enable HTTP Event Collector.
    2. Create a new token to be used by the Gigamon DOP when sending data. This token ensures that the data is securely authenticated. Make a note of this token for use later.
    3. When prompted to select the source type, select New and enter the value gigamon:traffic. The Gigamon Deep Observability App for Splunk expects this source type; if you use a different one, update the app's configuration accordingly.

    4. When prompted to select an index for DOP data, select index main. If you decide to use any other index, update the gigamon_index macro in the Gigamon Deep Observability App for Splunk.

    For additional information on how to configure the HEC, refer to Splunk documentation.
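
Before moving on, it can help to confirm that the HEC accepts events in the format Gigamon will send. The sketch below is a minimal illustration, not part of the official integration: it builds (but does not send) an HEC request carrying one test event. The host, port, and token values are placeholders for your own; the source type and index match the ones configured above.

```python
import json
import urllib.request

# Placeholder values -- replace with your own Splunk host, HEC port, and token.
SPLUNK_HOST = "splunk.example.com"
HEC_PORT = 8088
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_request(event: dict) -> urllib.request.Request:
    """Build an HTTP Event Collector request carrying one JSON event."""
    url = f"https://{SPLUNK_HOST}:{HEC_PORT}/services/collector/event"
    payload = json.dumps({
        "event": event,
        "sourcetype": "gigamon:traffic",  # must match the source type created above
        "index": "main",                  # must match the index chosen for DOP data
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Authorization": f"Splunk {HEC_TOKEN}",
                 "Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

req = build_hec_request({"message": "HEC connectivity test"})
print(req.full_url)
# Sending is left to you: urllib.request.urlopen(req) against a live instance
# should succeed when the token is valid and HEC is enabled.
```

Note that the Authorization and Content-Type headers here are exactly the two headers you will configure in the Gigamon AMX in Step 3.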

Step 2: Configure the Splunk platform

  1. Download the app and add-on from Splunkbase:
  2. Navigate to Apps > Manage Apps in your Splunk instance.
  3. Click Install App from File and upload the downloaded packages.

Step 3: Enable the Gigamon Application Metadata Intelligence (AMI) to send data to the Splunk platform

To enable AMI to send data to the Splunk platform, you'll need to configure Splunk platform details in the Gigamon Application Metadata Exporter (AMX).

  1. To add a monitoring session, click on AMX and provide details as follows:
  2. In Cloud Tool Exports, add Splunk.
    1. In Alias, enter Splunk_Cloud.
    2. In Cloud Tool, choose Other from the dropdown.
    3. In Endpoint, enter one of the following:
      • For Splunk Cloud Platform: <protocol>://http-inputs-<host>.splunkcloud.com:<port>/services/collector/event. For more details, see the Splunk Cloud Platform HEC configuration documentation.
      • For Splunk Enterprise: <protocol>://<splunk-instance-uri>:<port>/services/collector/event. For more details, see the Splunk Enterprise HEC configuration documentation.
    4. In Headers, enter Authorization: Splunk 2***********************. Replace the hidden portion with the HEC token you created in Step 1, and select the Secure Keys box.
    5. Click + and add another header: Content-Type: application/json; charset=utf-8. Select the Secure Keys box.
  3. Click Save and deploy the monitoring session.
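
The Endpoint values above follow a fixed pattern. As an illustrative sketch (the host names and default port 8088 are placeholders; your deployment may differ), the two forms can be expressed as:

```python
def hec_endpoint(host: str, port: int = 8088, protocol: str = "https",
                 cloud: bool = False) -> str:
    """Return the URL to paste into the AMX Endpoint field."""
    if cloud:
        # Splunk Cloud Platform HEC hostnames are prefixed with "http-inputs-".
        return (f"{protocol}://http-inputs-{host}.splunkcloud.com:{port}"
                "/services/collector/event")
    # Splunk Enterprise: point directly at the instance URI.
    return f"{protocol}://{host}:{port}/services/collector/event"

print(hec_endpoint("acme", cloud=True))
print(hec_endpoint("splunk.example.com"))
```

Either form must end in /services/collector/event, the path the HEC listens on for JSON events.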

Step 4: Test the integration

In the Splunk Search and Reporting application, query for the incoming data that matches the source type gigamon:traffic. For example (assuming you configured your HEC to save data in index=main):

index=main sourcetype="gigamon:traffic"

If the query returns results, you have successfully configured the integration and the Splunk platform is receiving and indexing Gigamon data.
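
You can also run this verification outside Splunk Web through the Splunk REST search API. The sketch below builds (but does not send) the request; the management URI and port 8089 are placeholders, and authentication is left to you.

```python
import urllib.parse
import urllib.request

# Placeholder -- replace with your own Splunk management URI and port.
SPLUNK_MGMT = "https://splunk.example.com:8089"

def build_verify_request(index: str = "main",
                         sourcetype: str = "gigamon:traffic") -> urllib.request.Request:
    """Build a POST against Splunk's REST search API that runs the same
    verification query as above and streams results back as JSON."""
    spl = f'search index={index} sourcetype="{sourcetype}" | head 5'
    body = urllib.parse.urlencode({"search": spl, "output_mode": "json"}).encode()
    return urllib.request.Request(f"{SPLUNK_MGMT}/services/search/jobs/export",
                                  data=body, method="POST")

req = build_verify_request()
# Add an Authorization header for your deployment, then urlopen(req)
# against a live instance to stream any matching events.
```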


When data is received into the Splunk platform, the dashboards in the Gigamon Deep Observability App you installed in Step 2 will be populated with the incoming data.
