Splunk Searching with REST API
As a way to justify essentially useless equipment around my house, I wanted to make a Raspberry Pi-driven display board.
The board would be simple enough to just present a handful of Splunk dashboards, while avoiding a window environment, a web browser, and all of the associated overhead on my relatively weak Pi Zero W. In other words, I wanted a way to display all of the data from the console.
I was able to complete this task using the documentation Splunk provides for searching via the REST API. I don't think they have a good proof of concept showing a fully working use case, but their documentation on all of the available features is quite in-depth.
One of the things I wanted to display was the count of accepted and blocked connections through my firewall. This data is already indexed on my local Splunk instance so all I have to do is search for it. The local Splunk instance is running on IP address 192.168.0.70 with the default REST interface running HTTPS on TCP 8089.
We can accomplish this in one of two ways: run the search on demand and pull the results as soon as the job finishes, or pull the results of a scheduled saved search.
I wanted to gather the results with a cron-scheduled bash script, so I built the script around the saved search method. While testing, I also ran the searches ad hoc via the REST API, so I have documented that method as well.
Run an Ad-Hoc Search via the REST API
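To kick off an ad-hoc job, POST the search string to the search/jobs endpoint. Here is a minimal sketch, with placeholder credentials and an illustrative firewall search standing in for my real one (the -k flag skips verification of the default self-signed certificate):

    curl -k -u admin:changeme https://192.168.0.70:8089/services/search/jobs \
        --data-urlencode 'search=search index=firewall | stats count by action'

Note that the search string passed to this endpoint must begin with the "search" command.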
The returned response contains a "SID" (search ID), which we need in our follow-up calls to check on the job and pull the results back.
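The response is a small XML document along these lines:

    <?xml version="1.0" encoding="UTF-8"?>
    <response>
      <sid>1712345678.123</sid>
    </response>

Wrapping the same call in a command substitution with a quick sed gives us the SID in a variable for the follow-up calls:

    SID=$(curl -s -k -u admin:changeme https://192.168.0.70:8089/services/search/jobs \
        --data-urlencode 'search=search index=firewall | stats count by action' \
        | sed -n 's/.*<sid>\(.*\)<\/sid>.*/\1/p')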
Retrieve Ad-Hoc Search Status via the REST API
We want to know when the job is done, so we need to check on its "dispatchState".
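The job endpoint returns a long XML document describing the job, so a grep is the quickest way to check it from the shell. A sketch, reusing the $SID captured above:

    curl -s -k -u admin:changeme \
        https://192.168.0.70:8089/services/search/jobs/$SID \
        | grep dispatchState

Once the job has finished, the matching line will read something like <s:key name="dispatchState">DONE</s:key>.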
Retrieve Ad-Hoc Search Results via the REST API
We can grab the results from the job; by default, the response will be XML.
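For example:

    curl -k -u admin:changeme \
        https://192.168.0.70:8089/services/search/jobs/$SID/results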
For my use case, CSV would be a better fit, so we need to convert the query to a GET request and specify the output mode.
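With curl, the --get flag sends the request as a GET and turns the data parameters into a query string:

    curl -k -u admin:changeme --get \
        --data-urlencode output_mode=csv \
        https://192.168.0.70:8089/services/search/jobs/$SID/results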
Create a Saved Search via the REST API
I do not want to have to run the search every single time I need to pull the results, so I will schedule a saved search that runs automatically.
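A sketch of the call, using an illustrative name of firewall_connection_counts and a run every 15 minutes; note that saved searches omit the leading "search" command, and is_scheduled=1 is required for the cron schedule to take effect:

    curl -k -u admin:changeme https://192.168.0.70:8089/services/saved/searches \
        -d name=firewall_connection_counts \
        --data-urlencode 'search=index=firewall | stats count by action' \
        --data-urlencode 'cron_schedule=*/15 * * * *' \
        -d is_scheduled=1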
A successful request will return the parameters of the new saved search in the response.
Retrieve Saved Search SID via the REST API
Again, we need to retrieve the SID in order to pull the results. This time, since we aren't running the job ad hoc, we have to query the saved search's history to find it. The output is quite verbose and all we need is the newest SID, so I use a regex to pull out the last job URL contained within the <id></id> tags.
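Something along these lines, assuming the saved search name from the previous example:

    JOB_URL=$(curl -s -k -u admin:changeme \
        https://192.168.0.70:8089/services/saved/searches/firewall_connection_counts/history \
        | sed -n 's/.*<id>\(.*\)<\/id>.*/\1/p' | tail -1)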
Retrieve Saved Search Results via the REST API
Now I want the results in a flat CSV file, so I will request the results from the previously retrieved job URL with output_mode set to csv.
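For example:

    curl -k -u admin:changeme --get \
        --data-urlencode output_mode=csv \
        "$JOB_URL/results"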
Bash Script to Update a Local CSV with the Results
Finally, I combined the manual steps above into a simple script that can be executed by a scheduled cron job. I am using the lastpass-cli to load the credentials into the script so they are not hardcoded.
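Here is a sketch of the script, with the illustrative saved search name from the earlier examples standing in for my real one:

    #!/bin/bash
    # Sketch only: assumes the lastpass-cli is installed and already logged in,
    # and that a scheduled saved search named firewall_connection_counts exists.

    SPLUNK="https://192.168.0.70:8089"
    SAVED_SEARCH="firewall_connection_counts"

    # Pull the credentials from the "testsplunk" entry in LastPass
    USER=$(lpass show --username testsplunk)
    PASS=$(lpass show --password testsplunk)

    # Find the job URL of the newest run of the saved search
    # (the last <id> in the history feed)
    JOB_URL=$(curl -s -k -u "$USER:$PASS" \
        "$SPLUNK/services/saved/searches/$SAVED_SEARCH/history" \
        | sed -n 's/.*<id>\(.*\)<\/id>.*/\1/p' | tail -1)

    # Request the results as CSV and write them to ./output
    curl -s -k -u "$USER:$PASS" --get \
        --data-urlencode output_mode=csv \
        "$JOB_URL/results" > output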
In this particular case, I have the credentials saved in a site called "testsplunk." The script outputs the CSV results to a file called "output" in the local directory. I would not recommend using it in a production environment, as there is no error checking or input validation beyond what the Splunk REST API does automatically. There is also no mechanism in the script to maintain an active login to the lastpass-cli; that would need to be handled outside of it.
Conclusion
While nothing in this exercise was particularly challenging, it was fun to interact with Splunk in a way I had not been tasked with before. I found the REST API easy to work with and quite extensible. It makes for an excellent way to get results out of Splunk without relying on the web GUI.
About Hurricane Labs
Hurricane Labs is a dynamic Managed Services Provider that unlocks the potential of Splunk and security for diverse enterprises across the United States. With a dedicated, Splunk-focused team and an emphasis on humanity and collaboration, we provide the skills, resources, and results to help make our customers’ lives easier.
For more information, visit www.hurricanelabs.com and follow us on Twitter @hurricanelabs.
