Do you like to play games? How about hide and seek? Or maybe seek and find games, like Hidden Expedition? While hide and seek may sound like fun, it’s not so fun when you have to look for issues or investigate incidents within your environment logs. But maybe seek and find offers a better option?
For me–and a lot of others–it’s not fun, and it’s time-consuming. I’d much rather locate something someone needs or track down an issue sooner rather than later. If you are reading this, then you know that Splunk is the answer, but not necessarily when it’s freshly out of the box. There may be a few things you’d have to tweak for it to be beneficial to you.
Let’s unbox this a little bit, shall we?
Although you can ingest just about any log into Splunk, it's not always going to be a useful log–at first. Splunk has a lot of built-in extractions that help with normalizing data. But what about the sources Splunk doesn't handle out of the box? There is likely a technology add-on (TA) for that.
A TA will help normalize your data. What do I mean by normalize? The simplest way to explain this is "fields." If you look at your data, it contains values. A TA applies field names to those values, and those fields are what become "normal." Normalized fields are almost always what's useful within Enterprise Security and data models. I'll tell you more about Enterprise Security in an upcoming post (so stay tuned!), but in short, normalized fields are the ones Splunk expects, which makes them common. Searching on a common field to find a value is much easier, and more efficient, than searching on the raw value alone.
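To make that concrete, here is a quick sketch of the difference (the index, sourcetype, and field names here are made up for illustration):

```
# Raw-value search: Splunk looks for the literal string "404",
# which also matches 404 appearing anywhere else in an event
# (byte counts, URLs, timestamps, and so on)
index=web "404"

# Common-field search: with a TA's extraction in place, you can
# target the value precisely, so results are accurate and the
# search does less work
index=web sourcetype=access_combined status=404
```

The second search only matches events where the extracted `status` field actually equals 404, which is exactly the kind of precision normalized fields buy you.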
That sounds great, but what about the proprietary applications in my environment? Is there a TA for that? Probably not–but don't worry. Creating field extractions is not as hard as you think, though probably not as easy as you think, either. You can create field extractions either via the web UI or via the CLI.
For beginners, I would recommend the UI, as there are guided steps for that–by the way, if you are interested in a tutorial on creating field extractions, leave a comment and I will be sure to get that done for you. Keep in mind that not all values will map to fields that are "common," but having a field is better than leaving the value with no field at all.
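If you do go the CLI route, a search-time field extraction is just a regular expression in props.conf. Here is a minimal sketch, assuming a made-up sourcetype and a log format along the lines of `user=jsmith code=200`:

```
# props.conf (search-time extraction; no reindexing required)
# "myapp:events" is a hypothetical sourcetype for a proprietary app
[myapp:events]
# Pull the value after "user=" into a field named user
EXTRACT-user = user=(?<user>\S+)
# Pull the numeric value after "code=" into a field named status_code
EXTRACT-status = code=(?<status_code>\d+)
```

Because these are search-time extractions, data you've already indexed picks them up as soon as the configuration is reloaded–nothing has to be reingested.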
Ultimately, normalize what you can
Let’s face it, not all data will be normalized–some data may just be too messy–but it’s still needed. My advice is to normalize what you can. It makes things run a lot smoother. Not only will searching the data be simpler for whoever uses Splunk, but the indexing tier where the data is stored will also appreciate the lighter load. Don’t play hide and seek with your data. Seek and find is the way to go.