Splunk Lantern

Showing the value of your Splunk deployment

 

Many everyday Splunk customers get a lot of value from Splunk, are proud to document their success stories, and even share them in person on the big stage at .conf. We want you to be clear about your own success with your Splunk deployment, and to communicate it both externally and internally. There are two main ways to show the value of your Splunk deployment: by showing the range of use case outcomes you've achieved, and by tracking Splunk utilization by your organization's business units or end users.

Tracking use case completion helps you identify the specific use cases that matter most to your organization. Understanding the goal of each use case can help you prioritize data ingestion and technical resources.

As use cases are being explored and implemented, a showback plan can help to quantify user adoption rates. Showback is a method of tracking Splunk utilization by your organization's business units or end users. Unlike chargeback, showback keeps track of utilization without actually cross-charging those costs back to the departments that use them.

Tracking value through use case completion

The Value Realization Cycle

The Value Realization Cycle is a continuous process for identifying, implementing, and evaluating the value of new use cases that your business adopts.

To get maximum value from your investment in Splunk products, it's vital to put a Value Realization Cycle in place intentionally. We recommend that your organization establish a use case planning session at least every other month to ideate and refresh the use cases to be deployed.

(Diagram: the Value Realization Cycle)

The Value Realization Cycle consists of a few different steps:

  • Step 1 - Define a situation and goal. This can be a very targeted tactical challenge or a more expansive one. For example, you might aim to reduce the mean-time-to-detect (MTTD) of security-related incidents or to automate your incident response.
  • Step 2 - Identify and record use cases. There could be one or more use cases that help you improve the situation and achieve the goal you defined in step 1.
  • Step 3 - Deploy use case(s) and document the value achieved. Use the Use Case Registry to help you track and document your progress.

And then, repeat!

The Use Case Registry

The Use Case Registry is a tool used to track the specific use cases you want to deploy, along with the names of use case implementation owners and timelines for completion.

As well as running a regular use case planning session at least every other month, we recommend that you have weekly or bi-weekly team meetings focused on the Use Case Registry to checkpoint and track your progress in achieving the value you've identified from the use cases you choose to implement.

You can create the Use Case Registry in a spreadsheet or project management tool of your choice. Here is an example of a Use Case Registry you can use, or you can download this template file. You might want to add extra columns to these examples to track additional information such as status, prerequisites, or comments.
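A registry like this can be kept in any tool, but its essential shape is a small table of use cases, owners, target dates, and status. The sketch below models that table in Python, assuming purely illustrative column names and sample rows (none of these fields come from the template mentioned above):

```python
from datetime import date

# Hypothetical Use Case Registry rows; column names and entries are illustrative only.
registry = [
    {"use_case": "Reduce MTTD for security incidents", "owner": "SecOps",
     "target_date": date(2024, 6, 30), "status": "in progress"},
    {"use_case": "Automate incident response", "owner": "SOC Automation",
     "target_date": date(2024, 3, 31), "status": "deployed"},
]

def overdue(rows, today):
    """Return use cases past their target date that are not yet deployed."""
    return [r["use_case"] for r in rows
            if r["status"] != "deployed" and r["target_date"] < today]

# A weekly checkpoint meeting could start from a list like this.
print(overdue(registry, date(2024, 7, 15)))
```

A query like `overdue` is the kind of checkpoint question a regular registry review meeting can answer at a glance.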

For more information on using the Value Realization Cycle and Use Case Registry, see the Security Use Case Explorer and the Observability Use Case Explorer.

Tracking utilization through a showback plan


A showback plan is a way to demonstrate that you are getting your money's worth from your Splunk investment. Metrics and cost tracking are a way to measure that value and allocate it to the users and beneficiaries of that value.

A showback practice is most effective for implementations with more than one constituency, scoped as Splunk as a service or Splunk as a strategy. A showback plan might be more formal than is needed for Splunk-as-a-solution deployments that serve only one team, group, or purpose. See Determine the purpose and scope of your Splunk deployment for an explanation of charter levels.

Having a data-driven showback plan can be an effective way to foster positive competition among teams to make their Splunk usage more effective by challenging them to unlock more value per unit of cost. A showback plan should drive adoption and improve efficiency, not just track cost centers.

Measure utilization on a scale

Start by establishing a benchmark for your Splunk usage. Your initial benchmark will be an estimate, but as your user community spends time with your Splunk deployment, these estimates will become more accurate and detailed.

Next, design a measurement plan that matures over time. Start out with what you can easily measure, and add in elements that are harder to measure over time when you have usage patterns on which to base your estimates. Communicate this plan to your user community. For example:

  • Years 1 and 2: measure license usage only
  • Year 3: measure license and hardware costs
  • Year 4: measure license, hardware, and support costs

This phased approach gives everyone in your organization time to establish their Splunk usage and value realization while getting licenses properly funded. Adjust the timing as needed.
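The phased plan above can be expressed as a simple lookup of which cost components count in each program year. The sketch below is a minimal illustration, assuming hypothetical cost figures and the example phase schedule from the list above:

```python
# Which cost components are measured in each program year (from the example plan).
PHASES = {
    1: ["license"],
    2: ["license"],
    3: ["license", "hardware"],
    4: ["license", "hardware", "support"],
}

def showback_cost(year, costs):
    """Sum only the cost components measured in the given program year."""
    components = PHASES.get(year, PHASES[4])  # year 5 and later measure everything
    return sum(costs[c] for c in components)

# Hypothetical annual costs attributed to one business unit.
team_costs = {"license": 120_000, "hardware": 30_000, "support": 15_000}
print(showback_cost(2, team_costs))  # → 120000 (license only)
print(showback_cost(4, team_costs))  # → 165000 (all components)
```

Keeping the schedule in one table makes it easy to adjust the timing, as the text suggests, without touching the reporting logic.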

Establish baseline agreements

  • Establish baseline data usage agreements. Include storage retention solutions and staff overhead.
  • Base your license usage requirements on near-maximum usage. Using near-maximum measurements rather than averages ensures that the baseline you establish has enough headroom for meeting daily volume obligations without exceeding license boundaries.
  • Provide a fixed chargeback cost and account for overages. This provides an incentive for users to be good stewards of usage.
  • Avoid providing discounts for under-usage. Discounts can discourage adoption.
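The "near-maximum rather than average" guidance above can be sketched as picking a high percentile of observed daily ingest. The example below uses an illustrative 95th percentile and made-up daily volumes; the percentile choice is an assumption, not a Splunk recommendation:

```python
def near_max_baseline(daily_gb, pct=0.95):
    """Return the value at the given percentile of observed daily ingest.

    Using a high percentile rather than the average leaves headroom for
    daily volume obligations without exceeding license boundaries.
    """
    ordered = sorted(daily_gb)
    idx = min(len(ordered) - 1, int(pct * len(ordered)))
    return ordered[idx]

# Hypothetical daily ingest in GB over ten days.
ingest = [40, 42, 45, 44, 60, 41, 43, 55, 46, 44]
avg = sum(ingest) / len(ingest)          # 46.0 GB: too tight as a baseline
print(near_max_baseline(ingest))         # → 60: covers the observed peak
```

Here the average (46 GB) would be exceeded on spike days, while the near-maximum baseline (60 GB) absorbs them.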

Include a variety of factors in usage measurements

  • Keep measurements general. Include general usage measurements, such as license and overall memory usage. Avoid measuring minute details, such as the search runtime or memory usage of individual use cases.
  • Consider indirect usage. Factor in outside resources, such as storage retention solutions, hardware needs, and support usage.
  • Include predictions. Predictions help stakeholders see their growth over time. Predictions help set expectations, and keep planning and communications proactive.

Measure support time and grade on a scale

When measuring support time, design a plan that encourages adoption. Measuring support time at a flat rate can discourage adoption; for example, users might assume that if they use the Splunk deployment less, they will need less support and be charged less. Instead, grade groups based on the percentage of certified users, search maturity, add-on support, and support time. This builds an incentive for users to increase their Splunk adoption. For example:

  • Group A wrote a PowerShell script to pull exactly what they needed and maintained the app themselves so the support group didn't have to. Support time was trivial.
  • Group B spent less time learning and relied on the support group to build tooling for them. Support and training time was greater.
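A grading scheme like the one described above might be sketched as a simple weighted score. The rubric, weights, and group figures below are all hypothetical, chosen only to illustrate how self-sufficiency can be rewarded rather than penalized:

```python
# Hypothetical grading rubric: reward adoption, not just low support hours.
def adoption_grade(certified_pct, support_hours, self_maintained_apps):
    """Score a group on adoption signals; weights are illustrative."""
    score = certified_pct * 100          # up to 100 points for certified users
    score += self_maintained_apps * 10   # reward teams that maintain their own tooling
    score -= support_hours * 0.5         # heavy reliance on support lowers the grade
    return round(score, 1)

# Group A: scripted their own data pull, trivial support time.
print(adoption_grade(0.6, 2, 1))    # → 69.0
# Group B: relied on the support group to build tooling.
print(adoption_grade(0.2, 40, 0))   # → 0.0
```

Under a rubric like this, Group A's self-sufficiency earns a higher grade even though both groups consume the same Splunk deployment.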

Next steps for implementing a showback plan

The Splunk App for Chargeback from Splunkbase is a great resource to start implementing a showback plan.

The Splunk App for Chargeback provides an easy-to-use experience to analyze how internal business units are leveraging Splunk. The App provides the framework necessary for both chargeback and showback use cases for:

  • Splunk Virtual Compute (SVC)
  • Dynamic Data: Active Searchable (DDAS)
  • Dynamic Data: Active Archive (DDAA)
  • Dynamic Data: Self-Storage (DDSS)

Go further with Splunk Outcome Paths

If the guidance on this page helped you, you might also be interested in assessing capacity needs. That article is part of a series of prescriptive strategies designed to drive you toward your business and technical goals by mitigating risk in the way you work with the Splunk platform.