Integrating Gigamon Deep Observability Pipeline with the Splunk platform
The need for real-time network visibility and monitoring is paramount. Gigamon's Deep Observability Pipeline (DOP) is a powerful tool designed to provide in-depth network visibility, traffic monitoring, and data intelligence.
Integrating this data with the Splunk platform can help you enhance your enterprise network security posture, optimize performance, and gain actionable insights. In this article, we'll guide you through the steps required to configure the integration of network-derived intelligence data from the Gigamon Deep Observability Pipeline into both Splunk Enterprise and Splunk Cloud Platform.
Data required
Network-derived intelligence data from the Gigamon Deep Observability Pipeline
Prerequisites
Before proceeding, ensure you have the following prerequisites:
- Gigamon Deep Observability Pipeline set up in your environment
- Splunk Enterprise or Splunk Cloud Platform instance with administrative access
- Gigamon Deep Observability App for JSON data or Gigamon Deep Observability App for CEF data
- Gigamon CIM Add-On (required only for applications that need Common Information Model (CIM) compliance)
- Basic admin-level knowledge of the Splunk platform and Gigamon Deep Observability Pipeline
Steps for integration with Splunk Enterprise and Splunk Cloud Platform
Step 1: Prepare Gigamon for integration
- Configure the Gigamon Observability Gateway on Gigamon's GigaVUE V Series virtual visibility nodes (VSN). The feed to Application Metadata Intelligence (AMI) should come from your various tapped sources.
- Deploy a V Series node (for the Application Metadata Exporter) with the traffic acquisition method set to Customer Orchestrated Source.
- Create a Monitoring Session (Rep 1 (In) > AMX > Rep 2 (Out)). See the Gigamon documentation for more details and a pictorial flow of the solution.
- Enable the HTTP Event Collector (HEC) on the Splunk platform. For the Splunk platform to receive HTTP-based data from Gigamon DOP, you need to enable the HEC on your Splunk instance. As an alternative to using the HEC to send Gigamon data to the Splunk platform, you can use Kafka as a message bus between Gigamon and the Splunk platform; for more details, refer to the Gigamon and Splunk documentation.
- Navigate to Settings > Data Inputs in the Splunk platform and enable HTTP Event Collector.
- Create a new token to be used by the Gigamon DOP when sending data. This token ensures that the data is securely authenticated. Make a note of this token for use later.
- When prompted to select the source type, select New and enter the value gigamon:traffic.
- When prompted to select an index for DOP data, select the main index. If you decide to use any other index, update the gigamon_index macro in the Gigamon Deep Observability App for Splunk accordingly.
For additional information on how to configure the HEC, refer to Splunk documentation.
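Before configuring the Gigamon side, you can optionally verify that the HEC token and endpoint work by posting a sample event directly to the collector. The following Python sketch is an illustration only, not part of the Gigamon configuration; the host, port, and token are placeholders, and it assumes the requests library is installed and the HEC endpoint is reachable from where you run it.

import requests

# Placeholder values; substitute your own Splunk host, HEC port, and HEC token.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "<your-hec-token>"

event = {
    "index": "main",                  # the index selected for DOP data
    "sourcetype": "gigamon:traffic",  # the source type created above
    "event": {"message": "HEC connectivity test"},
}

response = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    json=event,
    verify=False,  # use proper certificate verification outside of lab testing
    timeout=10,
)
print(response.status_code, response.text)  # expect 200 and {"text":"Success","code":0}

A 200 response with "Success" confirms the token is valid; the test event will then be searchable under the gigamon:traffic source type.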
Step 2: Configure the Splunk platform
- Download the app and add-on from Splunkbase:
- Gigamon Deep Observability App for JSON data
- Gigamon Deep Observability App for CEF data
- Gigamon CIM Add-On (required only for applications that need CIM compliance)
- Navigate to Apps > Manage Apps in your Splunk instance.
- Click Install App from File and upload the downloaded packages.
Step 3: Enable the Gigamon Application Metadata Intelligence (AMI) to send data to the Splunk platform
To enable AMI to send data to the Splunk platform, you'll need to configure Splunk platform details in the Gigamon Application Metadata Exporter (AMX).
- To add a monitoring session, click on AMX and provide details as follows:
- In Cloud Tool Exports, add Splunk.
- In Alias, enter Splunk_Cloud.
- In Cloud Tool, choose Other from the dropdown.
- In Endpoint, enter one of the following:
  - For Splunk Cloud Platform: <protocol>://http-inputs-<host>.splunkcloud.com:<port>/services/collector/event. For more details, see the Splunk Cloud Platform HEC configuration documentation.
  - For Splunk Enterprise: <protocol>://<splunk-instance-uri>:<port>/services/collector/event. For more details, see the Splunk Enterprise HEC configuration documentation.
- In Headers, enter Authorization: Splunk 2***********************, replacing the hidden part with the token you created in step 1.2, and select the Secure Keys box.
- Click + and add another header: Content-Type: application/json; charset=utf-8. Select the Secure Keys box.
- Click Save and deploy the monitoring session.
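Before deploying the monitoring session, you can optionally confirm that the endpoint string is well formed and that the HEC is reachable from the network where the V Series node runs. This Python sketch is an illustration only, not part of the AMX configuration; the protocol, host, and port are placeholders, and it relies on Splunk's HEC health endpoint (/services/collector/health), which does not require the token.

import requests

# Placeholders; use the same protocol, host, and port you entered in the Endpoint field.
protocol = "https"
host = "http-inputs-mystack.splunkcloud.com"  # placeholder; use your Splunk Cloud stack host or Splunk Enterprise URI
port = 443

# These mirror the endpoint patterns used in the AMX configuration above.
event_endpoint = f"{protocol}://{host}:{port}/services/collector/event"
health_endpoint = f"{protocol}://{host}:{port}/services/collector/health"

# An "HEC is healthy" JSON reply confirms the collector is enabled and reachable.
response = requests.get(health_endpoint, verify=False, timeout=10)
print(event_endpoint)
print(response.status_code, response.text)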
Step 4: Test the integration
In the Splunk Search and Reporting application, query for the incoming data that matches the source type gigamon:traffic. For example (assuming you configured your HEC to save data in index=main):
index=main sourcetype="gigamon:traffic"
If the query returns results, you have successfully configured integration with Gigamon and you are receiving and indexing data.
When data is received into the Splunk platform, the dashboards in the Gigamon Deep Observability App you installed in step 2.1 will be populated with the incoming data.
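You can also check for incoming data programmatically. The sketch below is one possible approach using the Splunk REST search API; it assumes access to the management port (8089 by default on Splunk Enterprise) and valid credentials, and every host and credential shown is a placeholder.

import requests

# Placeholders; substitute your search head URI and credentials.
SEARCH_URL = "https://splunk.example.com:8089/services/search/jobs/export"

response = requests.post(
    SEARCH_URL,
    auth=("admin", "<password>"),  # placeholder credentials
    data={
        "search": 'search index=main sourcetype="gigamon:traffic" | head 5',
        "output_mode": "json",
        "earliest_time": "-15m",
    },
    verify=False,  # use proper certificate verification in production
    timeout=30,
)

# The export endpoint streams one JSON object per line; any results confirm
# that Gigamon DOP data is being received and indexed.
for line in response.text.splitlines():
    if line.strip():
        print(line)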