Sensie’s Sonic Lab, Inspired by Waves & Brian Eno: Built with AudioKit




Sound designer Lorna Dune and developer Max Maksutovic are developing Sensie’s Sonic Laboratory. Partially inspired by Brian Eno, breathwork, and ocean waves, Sensie combines data from the iPhone’s IMUs, biofeedback principles, and sound-healing tools. This lets you track your stillness in a way that evokes awe, curiosity, and interest in oneself, ideally leaving you more present. Sensie combines art and technology to give you a musical/scientific instrument. Plus, it’s built using the AudioKit open-source code library!

Lorna and Max shared great advice about the making of the app!
Read the exclusive interview below:



WHAT WAS YOUR INSPIRATION FOR MAKING THE APP?

LORNA: Waves. Ocean waves, breathwork, the motion of water, radio waves, interference patterns between waves, and how to communicate with our own brain-wave patterns (binaural and monaural beats). Other inspirations include tai chi, qigong, chakra healing, sound baths, sacred frequencies, the elements, and Nature. Musically speaking, my inspirations were Brian Eno’s ‘Music for Airports’, Indian ragas, overtone singing, and Pythagorean harmonic exploration.

ANY ADVICE FOR WOULD-BE MUSIC APP DEVELOPERS?

LORNA: Check out AudioKit’s open-source audio framework tools as well as their iOS synthesizers. SynthOne is one of the most advanced software synths I’ve played. It has a real analog feel to it, and it’s free on iOS. I’m a sound designer, so my audio-programming partner, Maximilian Maksutovic, and I used AudioKit’s modules in Swift to create the signal chains, effects, EQs, and compression behind our musically interactive application.

I hope more app developers take advantage of these open-source resources!  
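For anyone who wants a starting point, here is a minimal sketch of that kind of module chaining, assuming AudioKit 5 with the SoundpipeAudioKit package. The specific nodes, settings, and routing are illustrative, not Sensie’s actual signal chain:

```swift
import AudioKit
import SoundpipeAudioKit

// A toy chain: oscillator -> delay -> reverb.
// Node names are from AudioKit 5 / SoundpipeAudioKit; values are illustrative.
let engine = AudioEngine()
let osc = Oscillator(frequency: 220, amplitude: 0.5)
let delay = Delay(osc, time: 0.4, feedback: 40, dryWetMix: 30) // time in seconds; feedback/mix assumed in percent
let reverb = CostelloReverb(delay, feedback: 0.8)

engine.output = reverb
do {
    try engine.start()
    osc.start()
} catch {
    print("Could not start AudioEngine: \(error)")
}
```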

MAX: I found myself getting a little paralyzed at the beginning of this project with “trying to do the *right* thing”, which was a canard. Just start experimenting and building your signal chains, one step at a time. If values aren’t making sense, or the signal chain you imagined in your head or prototyped in a DAW isn’t working, go back to the basics and build it one step at a time. Lorna was incredibly patient with my lack of experience with Operations, and as soon as I let go of trying to get the entire signal chain perfect from the get-go, we were able to iterate so much faster. Also, the AudioKit community is really a treasure, full of incredibly talented and kind people, which makes developing native audio experiences on iOS such a joy.



WHAT DID YOU LEARN FROM MAKING THE APP?

LORNA: We made some pretty major discoveries utilizing the iPhone’s sensor data to create gesturally interactive sound design. Do you know how many sensors we have in our phones?? So many! Rotation, acceleration, gravity, magnetometers, and many more.
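For the curious, here is a hedged sketch of reading those sensors with Apple’s CoreMotion: a single device-motion callback delivers rotation rate, user acceleration, gravity, and the calibrated magnetic field (the 100 Hz rate matches what Max describes later; the mapping step is left as a placeholder):

```swift
import CoreMotion

let motion = CMMotionManager()

if motion.isDeviceMotionAvailable {
    motion.deviceMotionUpdateInterval = 1.0 / 100.0  // 100 Hz
    motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                    to: .main) { data, error in
        guard let data = data else { return }
        let rotation = data.rotationRate        // rad/s around x, y, z
        let accel = data.userAcceleration       // in g's, gravity removed
        let gravity = data.gravity              // unit gravity vector
        let field = data.magneticField.field    // calibrated, in microteslas
        // Map these values onto synth parameters here.
        _ = (rotation, accel, gravity, field)
    }
}
```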

We’ve been working to turn this pile of data into interactive gestural healing sounds in an experimental exploration with the Sensie Sonic Laboratory team. Ideally, you can lose yourself in imagination or meditation while experiencing the feeling of striking a large drum or gong, or of gently moving the water around you. The sounds respond to your movements in a unique and joyful way. Building this app has inspired me to start practicing qigong at home and has taught me to do more regular mind-body check-ins to see where I’m feeling stuck.

I was wearing magnets on my ears, hands and feet to help move qi through blocked meridians, as applied by my acupuncturist around the Summer Solstice. Trying to get that toroidal flow going, you know? I think about the Earth’s magnetic field and our relationship to it in an ever climate-changing, technologically-advancing time. 

So why not create a synthesizer that responds to magnetic fields?! Using AudioKit’s SynthOne (my new fav iOS synth) as well as the Modal Resonance Filter and other fun tricks, the instrument brings out harmonics depending on a) which direction you are facing and b) whether there are electromagnetic fields around you, of which there are many! A special sound unlocks when facing North.
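As a rough illustration of that heading-to-harmonics idea, CMDeviceMotion’s heading can be scaled into the frequency of SoundpipeAudioKit’s ModalResonanceFilter. The mapping below is invented for illustration, not Sensie’s actual one:

```swift
import AudioKit
import SoundpipeAudioKit
import CoreMotion

// Invented mapping: compass heading (0-360 degrees) -> resonant frequency.
let engine = AudioEngine()
let noise = PinkNoise(amplitude: 0.3)
let filter = ModalResonanceFilter(noise, frequency: 220, qualityFactor: 200)
engine.output = filter

let motion = CMMotionManager()
motion.deviceMotionUpdateInterval = 1.0 / 30.0
motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { data, _ in
    guard let data = data else { return }
    let heading = AUValue(data.heading)    // 0 when facing magnetic north
    filter.frequency = 110 + heading * 2   // sweep ~110-830 Hz around the compass
}

try? engine.start()
noise.start()
```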

I’ve learned so much about rotational physics and magnetic fields, as well as the amazing possibilities of audio programming using AudioKit’s open-source audio framework tools. We’ve been exploring scientific concepts in an artistically conceptual way, which has been fun, playful, meditative, and intuitive. We hope others feel the same.

MAX: Oh, so much! CoreMotion is really an incredible API: it is easy to start learning, but there is just such depth to what a developer can do. In addition, there really aren’t a lot of in-depth explorations in the form of tutorials or retrospectives. It is *a lot* of trial and error to get an intuitive understanding of how the sensors work and how they can be useful; in our particular use case, that meant mapping their values to coherent AudioKit node parameter values. Once we found some good value-conversion mappings between CoreMotion and the node parameters, it was like magic hearing these synths and samplers trigger based on the motion of the iOS device. Lorna has an incredible ear and was able to build out these really beautiful signal chains with just a handful of components, many of which relied on LFOs, so it was my foray into SporthAudioKit; with much thanks to some consulting hours with Aurelius, we were able to build these incredibly dynamic oscillators using Operations. I hope to contribute and expose some more documentation so that other AudioKit developers can start experimenting with and augmenting SporthAudioKit.
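Here is a minimal sketch of the Operations style Max describes, assuming the SporthAudioKit package: a slow sine-wave LFO is rescaled and used as the frequency of an audible oscillator. The frequencies and ranges are illustrative:

```swift
import AudioKit
import SporthAudioKit

// An LFO-driven oscillator built from SporthAudioKit Operations:
// a 0.3 Hz sine wave is rescaled into the 220-440 Hz range and
// drives the pitch of the audible sine wave.
let engine = AudioEngine()
let generator = OperationGenerator {
    let lfo = Operation.sineWave(frequency: 0.3)
        .scale(minimum: 220, maximum: 440)   // sweep between A3 and A4
    return Operation.sineWave(frequency: lfo, amplitude: 0.4)
}
engine.output = generator

try? engine.start()
generator.start()
```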

WHAT WERE THE BIGGEST CHALLENGES?


LORNA: Combining sensor classes to represent 3D space. Most of the data we were receiving measured change only, meaning the system doesn’t “remember” where you moved from and can’t return to that place. The magnetometers provided constant data, which broke open some new gesture mappings. There is much more to be explored in these areas.
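One hedged way around that, for readers exploring the same problem: CoreMotion’s fused attitude gives absolute orientation rather than rates of change, so it does “remember” where the device points. The reference frame below assumes a usable magnetometer:

```swift
import CoreMotion

// Fused absolute orientation: unlike raw gyro/accelerometer readings
// (which measure change), attitude tracks where the device points.
let motion = CMMotionManager()
motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { data, _ in
    guard let attitude = data?.attitude else { return }
    // Euler angles in radians, relative to magnetic north.
    print(attitude.roll, attitude.pitch, attitude.yaw)
}
```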

We were aiming for a premiere at Barcelona’s “Festival of Consciousness” on July 29th, so our research-and-development period before building was quite short. What we discovered and implemented is astounding considering the schedule.

MAX: Looking back now, Lorna is so right: we had so little time but managed to make a really magical experience, and that speaks not only to what a brilliant development platform Swift and iOS are, but specifically to AudioKit. There would have been no way we could have met our deadline had we needed to build all of these components by hand in Core Audio. AudioKit is just extraordinarily powerful and expressive in just a handful of lines of code.

There were definitely some challenges keeping track of threads and making sure not to do needless calculations when updating the audio components. Since we wanted the IMUs (motion sensors) to update at their maximum frequency (100 Hz), the closure from CMMotionManager was firing every 0.01 seconds, so there had to be special consideration for when expensive calculations were happening in the gesture mapping or when updating audio components. Another challenge was being mindful that the accelerometer and gyroscope report rates of change: after you move the phone and come back to a place of rest, those values will report rest, which was a little unintuitive for me in trying to treat this as a gestural instrument.
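A hedged sketch of one way to keep that 100 Hz callback cheap: do only lightweight mapping math inside the closure, and skip audio-parameter writes that wouldn’t audibly change anything. The dead-band threshold and the tilt-to-pitch mapping are illustrative, not Sensie’s:

```swift
import AudioKit
import CoreMotion
import SoundpipeAudioKit

// Illustrative throttling for a 100 Hz CMMotionManager callback.
// Assume `osc` is already wired into a running AudioEngine.
let osc = Oscillator()
let motion = CMMotionManager()
motion.deviceMotionUpdateInterval = 0.01  // 100 Hz

var lastFrequency: AUValue = 0
motion.startDeviceMotionUpdates(to: .main) { data, _ in
    guard let data = data else { return }
    // Cheap mapping: tilt (gravity y) -> frequency. Expensive work
    // (FFTs, file I/O, allocation) does not belong in this closure.
    let freq = AUValue(440 + data.gravity.y * 220)
    if abs(freq - lastFrequency) > 0.5 {   // illustrative dead band
        lastFrequency = freq
        osc.frequency = freq               // only write when it matters
    }
}
```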

WHAT’S NEXT FOR YOU?

LORNA: I’d love to further explore interactive sound design through gesture using a potentially hands-free device (magnets? Bluetooth?), then develop for Android and hopefully get the chance to create a few more interactive sound worlds.

In the meantime, I’m wrapping up a development phase with SoundSelf, a pioneering software company creating immersive digital-therapy experiences: interactive musical systems that respond to the sound of your voice, combined with LED photic stimulation and vibration therapy. It weaves the disciplines of hypnosis, ceremony, meditation, and neuroscience into the fabric of a video game. The next development phase is for VR.

MAX: I feel we’ve just scratched the surface of what is possible when integrating CoreMotion with AudioKit, and there are some truly groundbreaking possibilities for gestural music making and sonic exploration. I’m looking forward to continuing our work, and to integrating more sensors (GPS, microphone, LiDAR, and others) to expand how an individual can interact sonically with their space and their body. We are also considering ways to network devices together to create a shared sonic space through movement, which I think would be an incredible experience.

Until then I’m continuing to build other music technology products for clients around the world using iOS and AudioKit at Allegro Software.

Thanks Lorna & Max!


Download the Sensie Sonic Lab app above
