The Coffee Report Part 1: Splunk, IoT, and… Coffee?

Published On: September 6th, 2017

Disclaimer: I’ve demoed the content of this article only a couple of times and the reactions I’ve received have been an almost unanimous: “You don’t have kids, do you?” For those of you who do have kids, hopefully I’ll make the content in my blog simple and valuable enough that you’ll find the time to try out some iteration of this project. If not, prepare to live vicariously through me.

Instead of making you read an entire blog post to see one cool dashboard, I’m going to start out with the cool part first and work my way back to how it was developed. So, without further ado, allow me to introduce what I like to call – The Coffee Report.

Why did I make this?

As with most Internet of Things (IoT) projects I work on, I’m prepared to answer the obvious “Why did you even make this?” question. As with any data visualization, a dashboard is used to help tell a story or paint a picture. In my case, I’m looking to paint a couple different pictures, but mostly one about coffee.

I set out to find out how I consume it, and how I can be more environmentally conscious about how it’s made in my home. Sure, this may sound extreme. I’m aware that not everyone has the time to sit around and think about their coffee in this level of detail. But it’s no secret that things like K-Cups are actually somewhat of an environmental disaster, and as a result something we should think about.

I’ve also created a screencast tutorial to go along with this blog post, so feel free to watch that as well.

Let’s break down the panels I have here and see what they can tell us about environmentalism.

Cups Saved – This panel can help to tell me how environmentally friendly I’m being (in relation to coffee at my house). In 2017, it’s important we all remain a little bit conscious of the world around us. Sure, it’s nice to stop at your favorite local coffee shop on the way to work, but have you considered how many single-use plastics are involved in that? This data specifically might not be terribly powerful, as it only relates to my own home, but the bigger picture certainly is pretty astounding when you compare how many millions of tons of plastic are generated each year versus how many millions of tons are actually recycled.

I wanted to see how much of an impact I might be able to have on problems like this if I made slightly different choices in my daily routines. So, cups saved relates to K-Cups (if I make my coffee at home), or plastic cups (if I choose to visit a major coffee brand not to be named here).

Average Brew Time – This panel is fairly self-explanatory, but I wanted to see if I could not only figure out how long each individual cup of coffee takes to make, but also, on average, how long I should expect a cup to take. This type of mathematical function could help me later on to determine numerical outliers in my brew times. It could also help me paint a picture of how convenient it really is for someone like me to just make my own coffee in the morning rather than spend the time driving somewhere.
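To make the outlier idea concrete, here’s a small sketch of the kind of math I’m describing: compute the average brew duration and flag anything more than a couple of standard deviations away from it. The sample durations below are made up for illustration, not my actual data.

```python
from statistics import mean, stdev

def brew_outliers(durations, k=2.0):
    """Return the average duration and any durations more than
    k standard deviations away from that average."""
    avg = mean(durations)
    sd = stdev(durations)
    return avg, [d for d in durations if abs(d - avg) > k * sd]

# Brew times in seconds (hypothetical sample data).
times = [212, 205, 198, 220, 210, 600, 215]
avg, outliers = brew_outliers(times)
# The 600-second brew stands out as an outlier here.
```

A 600-second brew might mean I walked away mid-pot, or that the machine is due for a descaling, which is exactly the kind of thing averages alone would hide.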

Yesterday’s Energy Usage – This panel is also pretty self-explanatory. What I’ve done is create a single-value visualization to show me how many kilowatt hours (kWh) my coffee maker used yesterday. This shows how much energy it takes me to brew a cup of coffee.

Brew Times – This is every single cup of coffee I’ve made during the time range specified with my time picker.
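Turning a stream of power readings into discrete “brews” boils down to threshold detection: the coffee maker idles at a couple of watts and jumps into the hundreds while heating. The sketch below shows that idea in plain Python; the 100 W threshold is my illustrative guess, not a value from the add-on.

```python
def detect_brews(watts, threshold=100.0):
    """Return (start, end) index pairs for each run of samples
    where power stays above the threshold (i.e. one brew each)."""
    brews, start = [], None
    for i, w in enumerate(watts):
        if w > threshold and start is None:
            start = i                      # brew began
        elif w <= threshold and start is not None:
            brews.append((start, i))       # brew ended
            start = None
    if start is not None:                  # brew still running at end
        brews.append((start, len(watts)))
    return brews

readings = [2, 3, 950, 940, 930, 4, 2, 900, 910, 3]
# Two brews: samples 2-4 and samples 7-8.
```

Multiply each span’s length by the sampling interval and you have the brew durations feeding the Average Brew Time panel.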

Brew Statistics – This is a detailed chart of various measures of energy usage over my specified time. I can look at Amps, Volts, Watts, and kWh all side-by-side with one another.

Total Energy Usage by Day – Last but not least, I can look at my energy usage per day. This visualization is very powerful because it shows me how much energy I’m drawing even on days when I didn’t make any coffee. Imagine how many other things we’re leaving plugged in in our homes that are drawing energy and providing us no value. An energy minimalist may take that information and start unplugging every appliance in their house. I can honestly say I’m not on that level, but it’s interesting to be able to quantify this.

Where did I get the data?

Retrieving this data was fairly straightforward once I had the correct tools in place. The IoT devices that made it all happen were the TP-Link HS110 and a Raspberry Pi.


– TP-Link HS110
– Raspberry Pi
– Splunk Universal Forwarder for ARM
– TP-Link HS110 Add-on for Splunk

Setting up the ingredients


The TP-Link HS110 comes with its own set of setup instructions, so I’m not going to reiterate them here. Suffice it to say that having a TP-Link HS110 on your network is an obvious required step for this project. The TP-Link HS110 has two nice features that made the project possible. For starters, it can measure energy usage in real-time. Unfortunately, the TP-Link HS100 does not have that feature, so be careful to choose the right device when you are purchasing. Second, after setup, the TP-Link HS110 is directly connected to the WiFi in your home. This direct connection makes it completely possible for a Raspberry Pi to query it at any moment.
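For the curious, the HS110’s local API has been reverse-engineered and widely documented (notably by softScheck’s tplink-smartplug project): the plug listens on TCP port 9999 and speaks JSON obfuscated with a simple XOR autokey cipher. The sketch below, based on that documented protocol rather than the add-on’s actual code, shows how a query for real-time energy readings works:

```python
import json
import socket
import struct

KEY = 171  # initial autokey value in TP-Link's XOR obfuscation

def encrypt(plain: str) -> bytes:
    """XOR-autokey encode and prepend the 4-byte big-endian length header."""
    key, out = KEY, bytearray()
    for ch in plain.encode():
        key ^= ch
        out.append(key)
    return struct.pack(">I", len(out)) + bytes(out)

def decrypt(cipher: bytes) -> str:
    """Decode a payload (without the length header)."""
    key, out = KEY, bytearray()
    for ch in cipher:
        out.append(key ^ ch)
        key = ch
    return out.decode()

def query_emeter(host: str, port: int = 9999) -> dict:
    """Ask an HS110 for its real-time energy readings."""
    cmd = json.dumps({"emeter": {"get_realtime": {}}})
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(encrypt(cmd))
        length = struct.unpack(">I", sock.recv(4))[0]
        data = b""
        while len(data) < length:
            data += sock.recv(length - len(data))
    return json.loads(decrypt(data))
```

The response includes fields such as current, voltage, and power, which is exactly the raw material the dashboard panels are built from.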


I have a Raspberry Pi running for all sorts of home automation pieces already, so in order to retrieve the data from my TP-Link HS110, I didn’t have much work to do. If you’ve never set up a Raspberry Pi before, this isn’t the blog for that topic, but there are plenty of tutorials online to get started. One particularly nice feature of my Raspberry Pi is that it has a copy of the Splunk Universal Forwarder running on it. So for this project you’ll want to get your Raspberry Pi up and running with the Splunk Universal Forwarder installed. This made retrieving new data just a matter of writing the correct script and having it sent to my Splunk Enterprise installation. If you’re looking for an ARM-compatible Splunk Universal Forwarder, it is freely downloadable via the Splunk Universal Forwarder download site.
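The Universal Forwarder can run a script on a schedule and index whatever the script prints to stdout, which is the pattern a collection script like this would follow. Here’s a minimal sketch of such an emitter; the field names are illustrative placeholders, not necessarily what the HS110 add-on actually produces:

```python
import time

def format_event(reading: dict) -> str:
    """Render one energy reading as a timestamped key=value line,
    a format Splunk extracts fields from automatically."""
    ts = time.strftime("%Y-%m-%d %H:%M:%S")
    fields = " ".join(f"{k}={v}" for k, v in sorted(reading.items()))
    return f"{ts} {fields}"

# Hypothetical reading; a real one would come from querying the plug.
print(format_event({"power": 912.4, "voltage": 121.1}))
```

Point a scripted input stanza in the forwarder’s inputs.conf at a script like this, set an interval, and the events start flowing to Splunk Enterprise on their own.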


With my Raspberry Pi in place and the TP-Link hooked up, the rest was pretty straightforward: I created and installed the technology add-on (mentioned above).

Putting it all together

Once this data started flowing into my Splunk instance, it was just a matter of writing some searches to make sense of all the data I had. In short, that’s how I ended up creating the dashboard displayed above.
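To give a flavor of those searches, here’s a sketch of what the “Total Energy Usage by Day” panel might look like in SPL. Note the sourcetype and field names here are my placeholders, not necessarily what the add-on emits, so adjust to match your own data:

```
sourcetype=tplink:hs110
| timechart span=1d sum(kwh) AS daily_kwh
```

Swap `timechart` for `stats avg(duration)` over the brew events and you have the Average Brew Time panel; the rest of the dashboard follows the same pattern.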

At this point, over 30 days later, I am looking at almost 375,000 data points for my TP-Link HS110. Is that overkill just to talk about coffee? Probably. But if you think that’s bad, in Part 2 of this blog post I’ll show you how that much data can make for a great Machine Learning Toolkit example. Thanks for reading, and if you have any questions or comments, feel free to reach out!


About Hurricane Labs

Hurricane Labs is a dynamic Managed Services Provider that unlocks the potential of Splunk and security for diverse enterprises across the United States. With a dedicated, Splunk-focused team and an emphasis on humanity and collaboration, we provide the skills, resources, and results to help make our customers’ lives easier.

For more information, follow us on Twitter @hurricanelabs.