Face Maker’s 3D Augmented Reality Masks, Built with AudioKit

Remember those great audio visualizations from the music player software of years past? Imagine them on your iPhone X, powered by the TrueDepth camera and ARKit. The Particle Masks in Face Maker 1.2 proudly use AudioKit to create 3D music visualizations on faces.

The particles adapt to the rhythm and nuances of the music, shifting density and color to create a vibrant, LED-like effect that's mesmerizing to watch.

Developer Tim Sears has taken an interesting step forward for ARKit and AudioKit, pushing what is possible with the TrueDepth camera on the iPhone X.

"I had written a custom renderer based on ParticleLab so that I could take the raw texture from the shader and wire it right into SceneKit and ARKit. What I needed to do was have a listener to the callbacks from my renderer so that I could use AudioKit to influence the gravity of the renderer.

"I ended up using an AKMicrophone to capture the raw audio and set it up to not have any output. I then created a couple of properties in my class to keep track of the current amplitude and Fast Fourier Transform. These were of the AKAmplitudeTracker and AKFFTTap types, respectively." — Developer Tim Sears
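If you want to try the same approach, here's a minimal sketch of that audio-analysis setup, written against AudioKit 4's API. The `AudioAnalyzer` class, the zero-gain `AKBooster` trick, and the property names are illustrative assumptions, not code from Face Maker's source.

```swift
import AudioKit

// A minimal sketch of the setup Tim describes, assuming AudioKit 4.
// Class and property names are illustrative, not taken from Face Maker.
class AudioAnalyzer {
    private let mic = AKMicrophone()
    private let tracker: AKAmplitudeTracker
    private let fft: AKFFTTap

    init() {
        tracker = AKAmplitudeTracker(mic)   // tracks the mic signal's current amplitude
        fft = AKFFTTap(mic)                 // taps the mic for FFT (spectrum) data

        // Route through a zero-gain booster so the mic is analyzed but never heard.
        AudioKit.output = AKBooster(tracker, gain: 0)
    }

    func start() throws {
        try AudioKit.start()
    }

    /// Current values a renderer callback can read on every frame.
    var amplitude: Double { return tracker.amplitude }
    var spectrum: [Double] { return fft.fftData }
}
```

A renderer's per-frame callback can then read `amplitude` and `spectrum` and use them to nudge the particle simulation, which is essentially what Tim describes doing with his renderer's gravity.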

Tim was inspired by Simon Gladman’s example Particle Visualization project.

With a day or two of hacking, and with Apple's Face AR sample project and AudioKit as a starting point, you could build a similarly interesting Particle Face implementation of your own!
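As a hypothetical starting point, here's one way the analysis values could drive a stock SceneKit particle system from the render loop. The `ParticleFaceDriver` class, the mapping constants, and the use of `SCNParticleSystem` are my own assumptions for illustration; Face Maker drives its own ParticleLab-based renderer instead.

```swift
import SceneKit

// A sketch of feeding the AudioAnalyzer readings (from the snippet above) into a
// SceneKit particle system every frame. The mapping constants are arbitrary.
class ParticleFaceDriver: NSObject, SCNSceneRendererDelegate {
    private let analyzer: AudioAnalyzer
    private let particles: SCNParticleSystem

    init(analyzer: AudioAnalyzer, particles: SCNParticleSystem) {
        self.analyzer = analyzer
        self.particles = particles
        super.init()
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        let amplitude = CGFloat(analyzer.amplitude)           // overall loudness, roughly 0...1
        let bass = analyzer.spectrum.prefix(8).reduce(0, +)   // crude low-frequency energy

        // Louder audio -> denser, faster particles; more bass -> a stronger downward pull.
        particles.birthRate = 200 + amplitude * 2_000
        particles.particleVelocity = 0.1 + amplitude * 0.5
        particles.acceleration = SCNVector3(0, -bass * 2, 0)
    }
}
```

Set an instance of this driver as your `ARSCNView`'s renderer delegate (or call it from your existing delegate) and attach the particle system to the face geometry from Apple's sample project.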

Thanks for sharing your experience and awesome work, Tim!

Download the app now:
https://itunes.apple.com/us/app/face-maker-augmented-reality/id1329976459?ls=1&mt=8
