Logs in a container, get it?

Quick Splunk analysis

I had a situation where I needed to do a one-off analysis on some unstructured logs and I didn’t have a great way of doing it (at the time).

Most log analytics platforms I've used require a client/agent pipeline to massage the data and ship it to their engines. Splunk, however, can ingest log data on the fly and defer the massaging to query time, and it can take data in as part of the main install itself: there's no separate agent to run or configure just to get data in for analysis. Splunk’s Free tier allows 500MB/day to be ingested on a standalone instance!

It was a perfect fit for my one-off analysis! With this in mind, it became an easy task to do with Docker. Now, admittedly, the steps to get to this small code snippet took a while, but since figuring it out that first time, it has come in handy on numerous occasions since.

The magic

First, make sure you have Docker installed and configured.

Create a directory and change into it. Add all the logs you want to analyze to this directory. Next, run this command to get yourself Splunk running in a container.
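If you want to try this end to end without real logs, the staging step can be sketched like this (the directory name, file name, and sample line are made up; in practice you'd copy your own logs in):

```shell
# Stage logs in a directory that will be bind-mounted into the container.
# The sample line below is a stand-in for real log files.
mkdir -p splunk-import
printf 'Oct 10 13:55:36 host myapp[123]: sample log line\n' > splunk-import/app.log
cd splunk-import   # so ${PWD} in the docker command points here
```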

docker run --rm -it \
 -e SPLUNK_START_ARGS=--accept-license \
 -e SPLUNK_PASSWORD=password \
 -e SPLUNK_ADD="monitor /import/*" \
 -p 8000:8000 \
 -v "${PWD}":/import \
 splunk/splunk:latest
Once Splunk has started (provided your Docker VM has enough free space; it needs 5GB minimum), navigate to http://localhost:8000 and log in with the credentials admin/password.

The main index should contain all of the logs, ready to analyze with Splunk!
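If you're new to Splunk's query language, a simple first search to see what landed might look like this in the Search & Reporting app (`source` and `sourcetype` are default Splunk fields, not anything specific to these logs):

```
index=main | stats count by source, sourcetype
```

This groups every ingested event by the file it came from and the format Splunk auto-assigned at ingest, which is a quick sanity check that all your files made it in.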

Just Ctrl-C out of the container when you're done; thanks to the --rm flag, it cleans up after itself!