Splunk Lantern

Managing time ranges in your searches

 

Usually in a Splunk platform search, time is controlled by a drop-down menu that you can use to pick from a wide range of time options. However, not all of your data will have a traditional event timestamp, so you need other ways to manipulate time in your searches.

Solution

Time has no real format or substance in the Splunk platform. It's stored in the internal field _time in epoch time format. It displays as a human-readable date when used in a dashboard (unless you choose to do something different with it), but underneath, it's still epoch time.
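Since this article's examples are in SPL, a quick Python sketch can make the epoch idea concrete: the stored value is just a count of seconds since 1970-01-01 00:00:00 UTC, and the human-readable date is only a rendering of that number. The epoch value below corresponds to the example time used later in this article.

```python
from datetime import datetime, timezone

# An epoch timestamp is just seconds since 1970-01-01 00:00:00 UTC.
epoch = 1720432800  # corresponds to 2024-07-08 10:00:00Z

# Rendering it as a human-readable date is a display step;
# the underlying stored value never changes.
human = datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
print(human)
```

Converting back the other way (date string to epoch) is what the convert command does later in this article.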

So what happens in the background when you switch the time picker from yesterday to year-to-date?

When you select "Year to Date" from the preset choices, the search sets two fields:

  • info_min_time - the earliest time to search from (2024-01-01 00:00:01 in this example)
  • info_max_time - the latest time to search to (less than 2024-07-08 10:00:00Z in this example), or, for an all-time search, the value "+Infinity"

Assuming we aren't running an all-time search, what happens in the background is a comparison: if the _time of an event is greater than info_min_time and less than info_max_time, the event is kept; otherwise, it is excluded.
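That comparison can be sketched in Python. This is a simplified model of the filtering logic, not Splunk's actual implementation, and the epoch values are hypothetical:

```python
# Keep an event only if it falls inside the [info_min_time, info_max_time]
# window, treating "+Infinity" (an all-time search) as "no upper bound".
def in_window(event_time, info_min_time, info_max_time):
    if info_max_time == "+Infinity":       # all-time search: no upper bound
        return event_time >= info_min_time
    return info_min_time <= event_time <= info_max_time

# Hypothetical event times (epoch seconds) and a year-to-date window.
events = [1704067201, 1720432799, 1720500000]
kept = [t for t in events if in_window(t, 1704067201, 1720432800)]
```

Here the third event falls after info_max_time and is dropped; the other two survive.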

Example

Let's use a complicated example to see how this understanding can help us. In this example, we have a lookup file with a bunch of events, including two fields that contain timestamps.

[Screenshot: sample rows from the lookup file, including the Case_CreatedDate and Case_ClosedDate fields]

It has two possible fields that can be used as event time: Case_CreatedDate and Case_ClosedDate. How do we make use of these fields?

The base of our search is going to be as follows:

| inputlookup sfdc_lookup
| where isnotnull(Case_ClosedDate) AND isnotnull(Case_Area) AND isnotnull(Case_Product)

Step 1

We need to convert one of these fields (we'll choose Case_CreatedDate) to the format that _time uses, which is epoch. We do this by adding the following command to our query:

| convert timeformat="%Y-%m-%dT%H:%M:%S.%3N%z" mktime(Case_CreatedDate) AS Case_Created_Epoch

The timeformat value depends on the format of the timestamps in your data. You can find help on using the convert command in Splunk Docs.
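The same parsing step can be sketched in Python as a rough equivalent of the convert/mktime command above. Note that Python's strptime uses %f for fractional seconds where Splunk's timeformat uses %3N; the sample value is hypothetical:

```python
from datetime import datetime

# Rough Python equivalent of:
#   | convert timeformat="%Y-%m-%dT%H:%M:%S.%3N%z" mktime(Case_CreatedDate) ...
# Parse a timestamp string into epoch seconds.
raw = "2024-07-08T10:00:00.000+0000"   # hypothetical Case_CreatedDate value
epoch = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%f%z").timestamp()
```

If the string doesn't match the format exactly, the parse fails, which is the same reason a wrong timeformat in convert yields null values.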

Step 2

We need to put the value we are going to use as time in the right place. So, we'll rename or copy the value to _time:

| eval _time = Case_Created_Epoch

Step 3

To make sure the events are sorted in time order, add the following:

| sort 0 - _time

Step 4

Re-run the search, and the events will come back in reverse chronological order (newest first) based on the Case_CreatedDate timestamp. Unfortunately, this doesn't mean you can use the time picker to limit events quite yet.

To use the new _time field with the time picker, we need to explicitly tell the Splunk platform that we want to limit it. Remember the mention at the beginning of this article of info_min_time and info_max_time? This is where we use those fields.

Add the following to the search:

| addinfo
| where _time>=info_min_time AND (_time<=info_max_time OR info_max_time="+Infinity")

The addinfo command adds normally hidden fields about the current search to the results, including the following:

  • info_min_time
  • info_max_time
  • info_search_time
  • info_sid

By using the addinfo command in combination with the where statement, the time picker will now limit the search results to the selected time range.

More time manipulation options

Now that we’ve unlocked the secret of _time manipulation, it opens the door to a bunch of other things you can do with this. Below are a few examples of _time manipulation I’ve done in the past. These are not officially recommended uses, so use them at your own risk.

Trick 1

You can modify info_min_time and info_max_time to dynamically change the search window. For example, in the search below, we want to get the number of cases for the month prior to the selected window as well. In other words, with the time picker set to February through June, we grab the counts from January through June, then reset the search window back to what the time picker was set to.

| inputlookup sfdc_lookup
| where isnotnull(CloudStack_Name)
| convert timeformat="%Y-%m-%dT%H:%M:%S.%3N%z" mktime(Case_CreatedDate) AS Case_Created_Epoch
| convert timeformat="%Y-%m-%dT%H:%M:%S.%3N%z" mktime(Case_ClosedDate) AS Case_Closed_Epoch
| eval _time=Case_Created_Epoch
| sort 0 - _time
| addinfo
| where _time>=(info_min_time - 2592000) AND (_time<=info_max_time OR info_max_time="+Infinity")
| eval missingfca = if(isnotnull(Case_ClosedDate) AND isnull(Case_FCA),"MISSING_FCA","GOOD")
| eval AreaFCACombo = if(isnotnull(Case_FCA),Case_Area + ":" + Case_FCA,Case_Area + ":" + "FCA_NOT_SET")
| bin span=1mon@mon _time
| chart dc(Case_Number) AS CaseCount OVER AreaFCACombo BY _time
| fillnull value=0
| untable AreaFCACombo,_time,CaseCount
| streamstats current=f global=f window=1 latest(CaseCount) AS Prev_Count by AreaFCACombo
| eval Percent_Change = round(((CaseCount - Prev_Count)/Prev_Count)*100,2)
| fillnull value=0
| addinfo
| where _time>=info_min_time AND (_time<=info_max_time OR info_max_time="+Infinity")
| eval Change_Trend=case(Percent_Change<0,"DOWN",Percent_Change=0,"FLAT",Percent_Change>0,"UP")
| rex field=Percent_Change "-?(?<change>.*)"
| fillnull value=0
| fields - info_*
| eval AlertThresholds=case((change>=0 AND change<=10) OR (CaseCount<=10 OR Prev_Count<=10),"GREEN",(change>10 AND change<=25) AND (CaseCount>10 OR Prev_Count>10),"YELLOW",change>25 AND (CaseCount>10 OR Prev_Count>10),"RED")
| eval month=strftime(_time,"%Y-%m")
| chart sum(CaseCount) AS counting OVER AreaFCACombo BY month
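The key move in the search above is the first where clause: 2592000 is 30 days expressed in seconds, a rough "one month" used to widen the window downward before the counts are computed. A small Python sketch (with hypothetical epoch values) shows the arithmetic:

```python
# 2592000 seconds = 30 days, a rough "one month" offset.
THIRTY_DAYS = 30 * 24 * 60 * 60

# Hypothetical lower window boundary (epoch seconds) from the time picker:
info_min_time = 1706745600   # 2024-02-01 00:00:00 UTC
widened_min = info_min_time - THIRTY_DAYS   # reaches back ~one month further
```

Because 30 days is only approximately a calendar month, the bin span=1mon@mon step in the search is what actually snaps events to clean month boundaries; the second where clause then trims the results back to the window the time picker selected.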

Trick 2

Sometimes you just want a panel to show a specific timeframe (for example, six months) of data, while other panels require different time settings. By using a combination of manipulating info_min_time / info_max_time along with the relative_time function, you can create a window of time that ignores the time picker.

In this case, we get data from the first day of six months ago up to the end of the previous month, even if the time picker has different settings.

| inputlookup sfdc_lookup
| where isnotnull(Case_ClosedDate) AND isnotnull(Case_Area) AND isnotnull(Case_Product)
| convert timeformat="%Y-%m-%dT%H:%M:%S.%3N%z" mktime(Case_CreatedDate) AS Case_Created_Epoch
| eval _time = Case_Created_Epoch
| sort 0 - _time
| addinfo
| eval info_min_time=relative_time(now(),"-6mon@mon")
| eval info_max_time=relative_time(now(),"@mon")
| where _time>=info_min_time AND (_time<=info_max_time OR info_max_time="+Infinity")
| table *
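The relative_time(now(), "-6mon@mon") call snaps to the start of the current month and then steps back six months. A stdlib-only Python sketch of that snapping (the function name and dates here are illustrative, not part of Splunk's API):

```python
from datetime import datetime, timezone

# Sketch of relative_time(now(), "-<N>mon@mon"): snap to the first day
# of the month, then step back N months, handling year boundaries.
def month_start_minus(dt, months):
    total = dt.year * 12 + (dt.month - 1) - months
    year, month = divmod(total, 12)
    return datetime(year, month + 1, 1, tzinfo=timezone.utc)

# e.g. from 2024-07-08, six months back lands on 2024-01-01
start = month_start_minus(datetime(2024, 7, 8, tzinfo=timezone.utc), 6)
```

With months=0 this yields the start of the current month, which is exactly the info_max_time boundary that "@mon" produces, so the window runs through the end of the previous month.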

Next steps

Do you have any more tricks or tips for manipulating time? Log in using the icon in the upper right, and then let us know in the article comments.

These additional Splunk resources might help you understand and implement this product tip: