Getting the Most Out of Your AKSequencer: Use AKCallbackInstrument


I was recently going over the AKSequencer questions that have been posted to StackOverflow and I noticed a common theme. The answer to a surprisingly large number of questions (and they weren’t just duplicates of the same question) was ‘use AKCallbackInstrument’. Many, if not most, of the questions were from developers with iOS MIDI experience outside of AudioKit.  Seriously, AKCallbackInstrument doesn’t get the kind of love or attention that it deserves. If you’re using AKSequencer, but not using AKCallbackInstrument, you’re missing a very powerful tool. In this post, I’ll go over some ways to get the most out of AKSequencer. But first, for the benefit of those who haven’t used the MIDI sequencer side of AudioKit, or those who have, but haven’t really dug too deeply into the code, here is a brief outline of what AKSequencer is doing.

What is AKSequencer?

At the heart of AKSequencer is Apple's MusicSequence from the AudioToolbox Framework. This API feels pretty archaic (the earliest reference I could find was this bit of documentation from May 2001). It provided a reasonable way (at the time) to handle MIDI, which was already a decently ancient form of tech. The AudioToolbox Framework gives us MusicSequence, which owns a set of MusicTracks containing MIDI note events (among other events), each stamped with a MusicTimeStamp (the position of the event in 'beats'), which can be iterated over. Each MusicTrack addresses its events to a node in AVAudioEngine (or AUGraph). It is a C API, and even in Objective-C, it is far from user-friendly. Typically, if you wanted to access an event from a MusicTrack, after getting a MusicEventIterator for the track and iterating over to the event you're interested in, you'd need to make a function call with a signature like this:
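Here is the call in question, MusicEventIteratorGetEventInfo, as declared in AudioToolbox:

```c
// Fills in the type, timestamp, and a pointer to the raw event bytes
// for the event the iterator is currently positioned at.
OSStatus MusicEventIteratorGetEventInfo(MusicEventIterator  inIterator,
                                        MusicTimeStamp      *outTimeStamp,
                                        MusicEventType      *outEventType,
                                        const void          **outEventData,
                                        UInt32              *outEventDataSize);
```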

And from this, you'd need to look up the outEventType byte in the MIDI spec to know what kind of event data you're dealing with and how it should be interpreted, and then parse out the relevant info for that kind of event (e.g., if it is a note event, the outEventData would provide bytes for the status, note number, velocity, and channel, but if it is a time signature event, the bytes will be the top and bottom values of the time signature, the clocks per quarter note, and so on). In Swift, this is even less fun (unless UnsafeMutablePointers happen to be your kind of thing).

AKSequencer and AKMusicTrack primarily act as Swift wrappers around MusicSequence and MusicTrack respectively, mercifully shielding us from this ugliness and abstracting it into something both Swiftier and easier to wrap our brains around. But AudioKit also provides a few new abstractions for handling MIDI.

Vanilla AKSequencer

It is easy to get started with AKSequencer. You can either create a new AKSequencer from scratch, create one or more AKMusicTracks, and add your own note events using add():
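A minimal sketch of the from-scratch approach (AudioKit 4.x syntax; the note values here are arbitrary):

```swift
import AudioKit

let sequencer = AKSequencer()
let track = sequencer.newTrack()

// Add a one-beat middle C at the very start of the sequence
track?.add(noteNumber: 60,
           velocity: 100,
           position: AKDuration(beats: 0),
           duration: AKDuration(beats: 1))
```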

Or you can initialise the sequencer directly from a MIDI file:
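Something like this, assuming a file named "sequence.mid" (a placeholder name) has been added to the app bundle:

```swift
import AudioKit

// Loads "sequence.mid" from the main bundle, creating one
// AKMusicTrack per track in the file
let sequencer = AKSequencer(filename: "sequence")
```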

In either case, you'll need to send the AKMusicTrack data somewhere, and for many developers the default option is to send it directly into an AKMIDISampler, an AKOscillatorBank, or whatnot (something with a MIDIEndpointRef), just as we would connect a hardware sequencer to a hardware synth:
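For example, a sketch with an AKMIDISampler (AudioKit 4.x):

```swift
import AudioKit

let sequencer = AKSequencer()
let track = sequencer.newTrack()
let sampler = AKMIDISampler()

// Point the track's MIDI output at the sampler's endpoint
track?.setMIDIOutput(sampler.midiIn)

AudioKit.output = sampler
try? AudioKit.start()
sequencer.play()
```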

AKCallbackInstrument Provides a Better Way

AudioKit introduces some very handy ways of dealing with the MIDI note data in the sequence. And the callback instrument is a good example of this. AKCallbackInstrument has a MIDIEndpointRef, so we can connect it to the AKMusicTrack just like we would with the sampler etc. But it lets us provide a callback where we can specify precisely how we want the MIDI stream handled. Piping our AKMusicTracks into an AKCallbackInstrument, and assigning the callback is dead simple:
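A sketch of the wiring (AudioKit 4.x; myCallback stands in for whatever handler function you write):

```swift
import AudioKit

let sequencer = AKSequencer()
let track = sequencer.newTrack()
let callbackInst = AKCallbackInstrument()

// Route the track into the callback instrument instead of a sampler
track?.setMIDIOutput(callbackInst.midiIn)

// myCallback is any (AKMIDIStatus, MIDINoteNumber, MIDIVelocity) -> Void function
callbackInst.callback = myCallback
```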

Then we just need to provide the callback function itself, which will handle an event consisting of AKMIDIStatus, MIDINoteNumber, and MIDIVelocity parameters (note that MIDINoteNumber and MIDIVelocity are just type aliases for UInt8).  If we are using a sampler, we can just trigger it on or off in the callback based on the status of the MIDI event.
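For example, a sketch that simply forwards the events to an AKMIDISampler (AudioKit 4.x enum-style AKMIDIStatus; `sampler` is assumed to be set up as above):

```swift
func myCallback(status: AKMIDIStatus,
                noteNumber: MIDINoteNumber,
                velocity: MIDIVelocity) {
    switch status {
    case .noteOn:
        try? sampler.play(noteNumber: noteNumber, velocity: velocity, channel: 0)
    case .noteOff:
        try? sampler.stop(noteNumber: noteNumber, channel: 0)
    default:
        break
    }
}
```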

By adding this middle stage into the MIDI flow we get four key benefits:

1) Debugging

Nearly all iOS music devs are also musicians on some level, and as musicians, we appreciate the importance of listening. But we shouldn’t be using our ears as the first line of defence in testing and debugging our code. There are better/cleaner/saner ways of doing it.

A typical scenario: we set up a sequencer and wire it into a sampler, add some note events, and test it out by playing the sequence . . . only to be greeted with silence (I can't be the only one this happens to). The problem could easily be in either the MIDI stream or the signal chain. We need something better than trial and error to debug the problem. If we are using a callback to handle the MIDI events, we can easily add a breakpoint in the callback function and see where the problem lies.

2) Extending the output options

Maybe we start off by sending our AKMusicTracks to an AKOscillatorBank, but our users are asking us to send MIDI to their DAW or to other apps using either conventional MIDI or Audiobus. At this point, if you’re not already using AKCallbackInstrument, you’re going to need to set it up. So save yourself the headache of having to break your sequencer later by giving it a clean, loosely-coupled implementation right from the start.

3) Getting the UI to respond directly to MIDI data

If you want your UI to respond to the data in your MIDI tracks, this is the place to do it. Let’s say that you are using an AKKeyboardView and you want your sequencer’s playback to animate key presses on the keyboard. You could use the noteOn and noteOff messages to change the colour of the keys on the keyboard. Don’t forget that the MIDI events are being sent on a background thread, so make sure to explicitly do all UI changes on the main thread.
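As a sketch, AKKeyboardView's programmatic note methods make this straightforward (assuming a `keyboardView` property on your view controller; AudioKit 4.x):

```swift
func uiCallback(status: AKMIDIStatus,
                noteNumber: MIDINoteNumber,
                velocity: MIDIVelocity) {
    // MIDI events arrive on a background thread:
    // hop to the main thread before touching the UI
    DispatchQueue.main.async {
        switch status {
        case .noteOn:
            self.keyboardView.programmaticNoteOn(noteNumber)
        case .noteOff:
            self.keyboardView.programmaticNoteOff(noteNumber)
        default:
            break
        }
    }
}
```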

4) Applying logic to the MIDI stream

The callback is the perfect place to add logic to modify the flow of MIDI events coming from your tracks. Add a sustain button with logic that ignores the noteOff messages, add a mute button that ignores the noteOn messages, add a feature that transposes MIDI tracks in real time – all these and more could be easily implemented in the callback function (of course, you’ll probably also want to add some logic to make sure you don’t end up with any stuck notes.)
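As a sketch, sustain and mute can be as simple as two flags consulted in the callback (stuck-note housekeeping omitted; `sampler` assumed as before):

```swift
var isSustained = false  // while true, noteOffs are ignored
var isMuted = false      // while true, noteOns are ignored

func myCallback(status: AKMIDIStatus,
                noteNumber: MIDINoteNumber,
                velocity: MIDIVelocity) {
    switch status {
    case .noteOn where !isMuted:
        try? sampler.play(noteNumber: noteNumber, velocity: velocity, channel: 0)
    case .noteOff where !isSustained:
        try? sampler.stop(noteNumber: noteNumber, channel: 0)
    default:
        break
    }
}
```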

Two Other Use Cases for AKCallbackInstrument

Your UI needs positional data about MIDI events

Sometimes you want the UI to respond to your sequencer, but you need more information than is contained in your MIDI events alone. For example, perhaps each MIDI event is associated with a UIView on the screen and you want the appropriate UIView to light up when the note is being played. In this case you could designate a MusicTrack specifically for triggering UI events and send it to its own AKCallbackInstrument. As of AudioKit 4.2, you can use getMIDINoteData() to get an array of AKMIDINoteData structs for the events on that track. This data can be used to generate a 'sister' track with UI-specific events. We can (perhaps somewhat unintuitively) repurpose the noteNumber and velocity parameters to encode the track number and the index of the note event respectively.
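A sketch of generating the sister track (AudioKit 4.2+; `uiTrack` and `trackNum` are assumed to exist, and the velocity-as-index trick only works cleanly for tracks with fewer than 128 events):

```swift
let noteData = sequencer.tracks[trackNum].getMIDINoteData()

for (index, note) in noteData.enumerated() {
    uiTrack.add(noteNumber: MIDINoteNumber(trackNum),  // repurposed: which track
                velocity: MIDIVelocity(index),         // repurposed: which event
                position: note.position,
                duration: note.duration)
}
```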

Then in the callback for the UI track:
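A sketch of decoding those repurposed parameters (`noteViews` is a hypothetical 2D array of views indexed by track and event):

```swift
func uiTrackCallback(status: AKMIDIStatus,
                     noteNumber: MIDINoteNumber,
                     velocity: MIDIVelocity) {
    guard status == .noteOn else { return }
    let trackIndex = Int(noteNumber)  // decode: track number
    let eventIndex = Int(velocity)    // decode: event index
    DispatchQueue.main.async {
        self.noteViews[trackIndex][eventIndex].backgroundColor = .yellow
    }
}
```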

Other Administrative Duties

Finally, I often find myself creating an 'admin' track that will trigger events that don't relate specifically to the MIDI data I'm playing. This might include checking for cued changes before each loop starts, metronome tick events, cues for updating a playhead's position, an event for loop counting, and so on. This is closest to the use of MusicEventUserData in the bad old days. Admittedly, it feels a little counterintuitive to be encoding this with MIDI note data, so I find that setting up an enum to make this more readable can really help.
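For instance (the event names and helper functions here are hypothetical):

```swift
// Map admin note numbers to readable names
enum AdminEvent: MIDINoteNumber {
    case loopStart     = 0
    case metronomeTick = 1
    case playheadTick  = 2
}

func adminCallback(status: AKMIDIStatus,
                   noteNumber: MIDINoteNumber,
                   velocity: MIDIVelocity) {
    guard status == .noteOn,
          let event = AdminEvent(rawValue: noteNumber) else { return }
    switch event {
    case .loopStart:     applyCuedChanges()
    case .metronomeTick: playTick()
    case .playheadTick:  updatePlayhead()
    }
}
```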

Final Thoughts

When I first started using AKCallbackInstrument, it was only for handling special situations.  But I pretty soon came to the conclusion that it really belongs in even the simplest default AKSequencer implementation. And the more that I worked with it, the more possibilities it presented to me. I know that the common wisdom is ‘you aren’t going to need it’, but in the case of AKSequencer and AKCallbackInstrument, you probably will.  I hope that some of these ideas have been helpful. Happy coding and happy noise making.
