What are we doing?

Data sitting on a computer somewhere is pretty dull. If you work with data, it's a good idea to find lots of ways to interact with it, and if your data is specific to your field, you can probably think of plenty of domain-specific ways to do so.

For example, if your data is images, look at them. If you transform your data for any reason, look at it before and after the transformation. It sounds obvious, but machine learning engineers and data scientists can overlook it because building tools or bespoke visualisations to interact with data can feel outside the scope of their responsibilities.

Ok, preaching aside, let's build something that helps people working with audio in Jupyter notebooks interact with it: a widget that lets them listen to their audio in Python alongside any plots they have of it, e.g. the output of a neural network.

The end goal is an interactive audio plot like the one in this tweet. Credit to this StackOverflow post for sharing a HoloViews audio plot with a playhead.

Here's a version of the final widget that works in a browser. Note: the plot is clickable if you run it yourself.

Hear and look at audio

First things first, we want to be able to hear the audio. Conveniently, IPython comes with lots of out-of-the-box ways to display data. Here's one for audio:

from IPython import display
audio_path = "./my_icons/blah.wav"
# Render an in-notebook audio player for the file
display.Audio(audio_path)
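`display.Audio` isn't limited to files on disk: it also accepts raw samples plus a sample rate, which is handy when your audio is already a NumPy array (say, the output of a model). A minimal sketch, with the sample rate and 440 Hz test tone being arbitrary choices for illustration:

```python
import numpy as np
from IPython.display import Audio

sr = 22050  # sample rate in Hz (assumed, not from the original post)
t = np.linspace(0, 1.0, sr, endpoint=False)  # one second of time stamps
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)   # a 440 Hz sine wave at half amplitude

# Passing an array requires the rate keyword so the player knows the playback speed
Audio(data=tone, rate=sr)
```

In a notebook the last expression renders the same audio player as the file-based version above.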