If we hear a nearby gunshot, we instinctively react. A mechanic often knows their machine's sound so well that they can diagnose issues by ear alone. While machines can be given analytical capabilities with machine learning (ML), sensing human inputs - like auditory or other sensory data - in a form that machines can understand is challenging. At Splunk, we have been all about making machine data accessible to humans, but what if we flip that and make human data accessible to machines? I take audio captured from live and recorded sources and, using the Fast Fourier Transform (FFT), feed it into Splunk's Machine Learning Toolkit (MLTK) for classification and anomaly detection. Can we use Splunk to detect gunshots? Can we learn a machine's normal sounds to detect pending failures? This presentation uses Splunk to apply superhuman ML detection and learning capabilities to human data, showing that the MLTK contains accessible tools you can apply to your IT and security problems.
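The audio-to-features step described above can be sketched in a few lines. This is a minimal illustration, not the talk's actual pipeline: it assumes audio arrives as raw sample windows, applies an FFT to each window, and reduces it to simple per-window features (peak frequency and spectral energy) of the kind that could be indexed in Splunk and handed to MLTK for classification or anomaly detection. The function name and feature choices are hypothetical.

```python
import numpy as np

def spectral_features(samples, sample_rate):
    """Hypothetical helper: FFT-based features for one window of audio.

    Returns the dominant frequency (Hz) and total spectral energy --
    the kind of compact per-window features that could be forwarded
    to Splunk and modeled with MLTK.
    """
    window = np.hanning(len(samples))            # taper to reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(samples * window))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    peak_freq = float(freqs[np.argmax(spectrum)])
    energy = float(np.sum(spectrum ** 2))
    return peak_freq, energy

# Synthetic example: a quiet 120 Hz machine hum vs. a loud broadband "bang".
rate = 8000
t = np.arange(rate) / rate                       # one second of audio
hum = 0.1 * np.sin(2 * np.pi * 120 * t)
rng = np.random.default_rng(0)
bang = hum + 2.0 * rng.standard_normal(rate)     # impulsive wideband noise

hum_peak, hum_energy = spectral_features(hum, rate)
bang_peak, bang_energy = spectral_features(bang, rate)
```

In a Splunk deployment these features would be emitted as events, after which MLTK's fit/apply workflow could learn the "normal" spectral profile and flag windows, like the bang above, whose energy or dominant frequency departs from it.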
Slides PDF link - https://conf.splunk.com/files/2019/slides/IoT1560.pdf?podcast=1577146207