Transforming unified network and Telco observability with Fabrix.ai and Splunk
In modern telecommunications environments, achieving autonomous, scalable, and efficient service assurance is critical. By combining the Fabrix.ai Data Fabric and low-code telemetry pipelines with Splunk IT Service Intelligence (ITSI), organizations can unify structured and unstructured data in real time. This integration enhances operational intelligence, accelerates insights, and streamlines IT operations for better service assurance.
Data required
Installation and configuration
Robotic Data Automation Fabric (RDAF) is Fabrix.ai's Data Fabric Technology Add-on for Splunk. It is installed in your data center, similar to how Splunk heavy forwarders are installed. Installation involves deploying a few VMs at a centralized location, and a Splunk or Cisco subject matter expert is typically involved in the installation and configuration process.
Getting data in
Connect to data sources
Using RDAF, you can seamlessly connect to more than 1,000 data sources using pre-built solution packs. The integration process is simplified to just three clicks via the graphical user interface (GUI).
For example, the solution pack for Cisco Catalyst Center demonstrates how to dynamically enrich and ingest data into the Splunk platform using telemetry pipelines. These pipelines run continuously, ingesting data through the Splunk HTTP Event Collector (HEC), as illustrated in the sketch after the pack details below. The pack details are as follows:
- Version: 9.0.1
- Folder Path:
/collection/network/cisco/campus_catalyst_center
- Published Date: April 30, 2025
- Publisher: Cloudfabrix
- Package Type: Collection
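The solution pack pipelines handle HEC delivery for you, but the following minimal sketch shows what posting one enriched event to the Splunk HEC looks like in practice. The endpoint, token, index, sourcetype, and event fields here are illustrative placeholders, not values taken from the solution pack; adjust them to match your own HEC configuration.

```python
import json

import requests

# Placeholder HEC endpoint and token; substitute the values for your own
# Splunk deployment.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

# Hypothetical event showing the kind of device context a telemetry pipeline
# might add before ingestion (all field names are assumptions).
event = {
    "device_name": "cat9300-branch-01",
    "site": "branch-ca-12",
    "role": "access",
    "reachability": "Reachable",
    "cpu_utilization_pct": 41.7,
}

payload = {
    "event": event,
    "sourcetype": "cisco:catalystcenter:device",  # assumed sourcetype
    "index": "network_observability",             # assumed index name
}

resp = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    data=json.dumps(payload),
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # HEC returns {"text": "Success", "code": 0} on acceptance
```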
Other popular solution packs include:
- Fabrix Inventory Collection Base Pack
- Cisco vManage
- VMware vROps
- Cisco Meraki
- VMware vCenter
- Base Pack
Automate data ingestion and enrichment
RDAF enables automated data ingestion into Splunk indexes, ensuring that data is enriched with context and ready for visualization. These service blueprints (pipelines) are scheduled to run continuously, ingesting data through the Splunk HEC. The following general steps describe the process to set up automated data ingestion, but refer to the Fabrix.ai Splunk integration documentation for specific guidance.
- Use Splunkify and telemetry pipelines to automate data ingestion.
- Process data through discovery and ingestion into P-streams before routing it into Splunk indexes.
- Ingest data into Splunk Enterprise using Splunk HEC.
- Verify enriched data using RDAF dashboards.
The ingestion process supports any type of data, ensuring it is normalized and enriched before being visualized in the Splunk platform.
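Beyond the RDAF dashboards, you can spot-check that enriched events reached the target index by querying the Splunk REST search API. This is a minimal sketch, assuming the same placeholder index and sourcetype as in the earlier HEC example; the host, credentials, and search string should be adapted to your deployment.

```python
import requests

# Placeholder Splunk management endpoint and credentials.
SPLUNK_HOST = "https://splunk.example.com:8089"
AUTH = ("admin", "changeme")

# The index and sourcetype are assumptions; use whatever your RDAF
# pipelines actually write to.
search = (
    "search index=network_observability "
    "sourcetype=cisco:catalystcenter:device | head 5"
)

resp = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    auth=AUTH,
    data={"search": search, "output_mode": "json"},
    stream=True,
    timeout=60,
)
resp.raise_for_status()

# The export endpoint streams one JSON object per line; print the raw results.
for line in resp.iter_lines():
    if line:
        print(line.decode("utf-8"))
```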
Use Splunkbase apps for visualization
After data has been ingested into Splunk indexes, you can use the Fabrix.ai Splunkbase apps to visualize the data and derive insights. Two primary Splunkbase apps are available:
- CloudFabrix Asset Analytics App for Splunk: Use this for asset visibility and insights through application dependency mapping and enriched data from data center, campus, edge, and SD-WAN environments. Use cases include data center modernization, application rationalization, lifecycle management, capacity utilization, real-time asset data collection, and CMDB updates.
- CloudFabrix Unified Network Observability: Use this for comprehensive network visibility across data center, campus, SD-WAN, mobile transport, and edge networks. It lets you integrate with existing network management systems, element managers, and network devices directly to continuously ingest diverse telemetry data, including metrics, logs, events, and SNMP traps. The app intelligently correlates data across network domains to reduce alert noise, deliver actionable insights, and accelerate root cause analysis to enhance network reliability and performance.
You can also use the RDAF dashboards for data visualization before data is fully ingested into Splunk indexes.
ITSI use cases
Automate ITSI service decomposition with RDAF
Fabrix.ai uses its workflow engine, together with RDAF's dynamic discovery capabilities, to automate ITSI service decomposition, significantly reducing the time required and enabling rapid deployment and configuration. The automation completes all of the following tasks:
- Automatically create entities with vital metrics
- Create KPIs and services
- Create aggregation policies
- Create correlation searches
- Modify the JSON document to meet your requirements during deployment
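Because the generated service definition is a JSON document, the deployment-time adjustments in the last step can be scripted. The sketch below is illustrative only: the file name and field names (title, kpis, threshold) are assumptions rather than a documented ITSI schema, so check the Fabrix.ai and ITSI documentation for the exact structure used in your environment.

```python
import json

# Hypothetical file exported or generated by the RDAF workflow; the name and
# structure are assumptions for illustration.
with open("itsi_service_definition.json") as f:
    service = json.load(f)

# Example adjustments before deployment: rename the service and retune a KPI
# threshold. Adapt these keys to whatever schema your export actually uses.
service["title"] = "Branch Network Assurance - Production"
for kpi in service.get("kpis", []):
    if kpi.get("title") == "CPU Utilization":
        kpi["threshold"] = 85  # assumed key; consult the ITSI KPI schema

with open("itsi_service_definition_modified.json", "w") as f:
    json.dump(service, f, indent=2)
```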
Dashboards in the RDAF workflow automation engine and in the Splunk platform show this workflow automation in action.
Utilize ITSI content packs for unified observability
You can use pre-configured ITSI content packs to significantly reduce the time to value for service assurance use cases. Fabrix.ai offers custom content packs for unified network observability and telecommunications service assurance. The benefits of these packs include:
- Visibility into KPIs, entities, notable events, and episodes
- Root cause analysis (RCA)
- Deep-link cross-launch of controller tools to aid RCA
If you are interested in setting these up in your Splunk ITSI deployment, contact your Splunk sales team.
Telco Service Assurance Custom Content Pack
Unified Network Observability Custom Content Pack
Next steps
For more information, see the following Fabrix.ai resources:
Fabrix.ai, the inventor of Robotic Data Automation Fabric (RDAF), delivers integrated, enriched, and actionable telemetry pipelines to operational and analytical systems. RDAF unifies Observability, AIOps, and Automation for Operational Intelligence Systems and enriches analytical systems. Fabrix.ai empowers business and IT leaders with AI-powered actionable intelligence to make faster and better decisions, and to accelerate IT planning and autonomous operations. Fabrix.ai is a Splunk Partnerverse Build partner that integrates its Data Fabric with the Splunk platform to accelerate data integration and enrichment, while automating ITSI content packs and providing enhanced visibility into complex telecommunications and unified networking deployments. For more information, visit Fabrix.ai.