How to Integrate Ableton Link with AudioKit using Swift or C++


Ableton Link is a great technology for syncing iOS audio apps with each other, as well as with Ableton Live. A Link session tracks the beat, bar, and phrase against a shared tempo so that apps can play in time with each other.

LinkKit is a C++ library with functions for creating and joining a Link session, getting the beat for the current time, updating the tempo, and so on, from both the audio thread and the UI thread. Best practice is to create it once, when you initialize your app and its audio engine, and to track the beat or request session updates in the audio render loop. Alternatively, you can set up a custom timer to act like an audio render loop, but that does not guarantee sample-accurate timing.

I recommend reading the Link documentation for an in-depth overview before we start.

Getting LinkKit

You need to request LinkKit from Ableton; they will then give you access to LinkKit’s private repo, where you can download the release along with UI guidelines and a well-built example app called LinkHut. The example app is worth studying. If you have no experience with Audio Units, audio DSP programming, or mixing Objective-C with C++ (Objective-C++), it will help a lot. Don’t worry if you are a Swift programmer; it works perfectly well with Swift apps. You can also test your implementation against LinkHut to check that your app stays in sync with other Link apps.

AudioKit Integration

Ableton Link works great with AudioKit. If you have an Audio Unit whose render loop (the audio thread) you can subscribe to, you are basically ready to go, and it doesn’t hurt to add one if you don’t currently have one. Otherwise, you can set up a custom loop with an NSTimer, for example if you are using AudioKit just for the AKSequencer.

Implementation

There are two main data structures you will work with. The first is the engine data, which holds the audio-engine-related state: output latency, BPM, quantum, and so on.

The second is the Link data, the data you need on the audio thread. It holds the ABLLinkRef (the Link instance itself) and two EngineData copies: one owned by the audio thread only, and one shared between the main and audio threads, so that changes to the EngineData from different threads don’t block each other. It also carries time-related data for calculating beats as accurately as possible.
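In sketch form (the field names follow LinkHut and the discussion in the comments below; your exact layout may differ):

typedef struct {
    Float64 proposeBpm;   // tempo change requested by the UI; INVALID_BPM if none
    Float64 quantum;      // e.g. 4 beats per bar
    BOOL requestStart;    // the UI asked the transport to start
    BOOL requestStop;     // the UI asked the transport to stop
    UInt32 outputLatency; // hardware output latency, in host-time ticks
} EngineData;

typedef struct {
    ABLLinkRef ablLink;          // the Link instance itself
    Float64 sampleRate;          // only written while the engine is stopped
    Float64 secondsToHostTime;   // converts seconds to mach host time
    EngineData sharedEngineData; // written by the main thread
    EngineData localEngineData;  // audio-thread-only copy of the above
    UInt64 timeAtLastClick;      // owned by the audio thread
    BOOL isPlaying;              // owned by the audio thread
} LinkData;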

In the audio render loop, we sync sharedEngineData to localEngineData with the static void pullEngineData(LinkData* linkData, EngineData* output) function, so that tempo or play/stop changes requested by the UI get applied on the audio thread.
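A sketch of that function, modeled on LinkHut’s approach (the OSSpinLock and the INVALID_BPM sentinel are conventions borrowed from LinkHut, not part of the LinkKit API):

static OSSpinLock lock = OS_SPINLOCK_INIT;
static const Float64 INVALID_BPM = -1.; // sentinel: "no tempo proposal"

static void pullEngineData(LinkData* linkData, EngineData* output) {
    // Reset the signaling members to their default state.
    output->proposeBpm = INVALID_BPM;
    output->requestStart = NO;
    output->requestStop = NO;

    // Try to take the lock guarding sharedEngineData,
    // but never block on the audio thread.
    if (OSSpinLockTry(&lock)) {
        // Copy the non-signaling members into the audio thread's cache.
        linkData->localEngineData.outputLatency = linkData->sharedEngineData.outputLatency;
        linkData->localEngineData.quantum = linkData->sharedEngineData.quantum;

        // Move the signaling members to the output and reset them.
        output->proposeBpm = linkData->sharedEngineData.proposeBpm;
        linkData->sharedEngineData.proposeBpm = INVALID_BPM;
        output->requestStart = linkData->sharedEngineData.requestStart;
        linkData->sharedEngineData.requestStart = NO;
        output->requestStop = linkData->sharedEngineData.requestStop;
        linkData->sharedEngineData.requestStop = NO;

        OSSpinLockUnlock(&lock);
    }

    // Whether or not we got the lock, read from the local copy.
    output->outputLatency = linkData->localEngineData.outputLatency;
    output->quantum = linkData->localEngineData.quantum;
}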

In the last part, in our DSP code (the audio render loop), we pull the engine data, capture the current session state, check for tempo and play/stop changes, and commit any changes requested by our client back to the session.
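In sketch form, again modeled on LinkHut (the ABLLink* calls are the real LinkKit functions, the same symbols that show up in the linker errors quoted in the comments below):

static OSStatus audioCallback(void *inRefCon,
                              AudioUnitRenderActionFlags *flags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData) {
    LinkData *linkData = (LinkData *)inRefCon;

    // Capture the current Link session state and pull the engine data.
    const ABLLinkSessionStateRef sessionState =
        ABLLinkCaptureAudioSessionState(linkData->ablLink);
    EngineData engineData;
    pullEngineData(linkData, &engineData);

    // The buffer reaches the speaker outputLatency ticks after mHostTime.
    const UInt64 hostTimeAtBufferBegin =
        inTimeStamp->mHostTime + engineData.outputLatency;

    // Apply play/stop requests from the UI.
    if (engineData.requestStart && !ABLLinkIsPlaying(sessionState)) {
        ABLLinkSetIsPlaying(sessionState, YES, hostTimeAtBufferBegin);
    }
    if (engineData.requestStop && ABLLinkIsPlaying(sessionState)) {
        ABLLinkSetIsPlaying(sessionState, NO, hostTimeAtBufferBegin);
    }

    if (!linkData->isPlaying && ABLLinkIsPlaying(sessionState)) {
        // Playback is starting: request beat 0 at the start time,
        // quantized to the session's quantum.
        ABLLinkRequestBeatAtStartPlayingTime(sessionState, 0., engineData.quantum);
        linkData->isPlaying = YES;
    } else if (linkData->isPlaying && !ABLLinkIsPlaying(sessionState)) {
        linkData->isPlaying = NO;
    }

    // Apply a tempo proposal from the UI, if any.
    if (engineData.proposeBpm != INVALID_BPM) {
        ABLLinkSetTempo(sessionState, engineData.proposeBpm, hostTimeAtBufferBegin);
    }

    // Commit the changes back to the session.
    ABLLinkCommitAudioSessionState(linkData->ablLink, sessionState);

    // ... render audio against the committed timeline here ...
    return noErr;
}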

When we register our audio render loop function with the Audio Unit, we should pass along our LinkData, so that we can pull it back out of the void *inRefCon parameter in our audioCallback function.
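For example, with AudioUnitSetProperty (assuming _audioUnit and _linkData are instance variables of your engine):

AURenderCallbackStruct callback;
callback.inputProc = audioCallback;
callback.inputProcRefCon = &_linkData;

AudioUnitSetProperty(_audioUnit,
                     kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Input,
                     0,
                     &callback,
                     sizeof(callback));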

Passing custom data

You can pass your own custom data type as well, if you want to do something more with the beat. For example, you can set up a custom callback type.
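Something like this (the block type and its name are purely illustrative):

typedef void (^AudioEngineBeatCallback)(Float64 beat);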

Then set up a property for it in your audio engine’s header file, so that another class can implement it.
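For instance, with a hypothetical AudioEngine class:

@interface AudioEngine : NSObject
@property (nonatomic, copy) AudioEngineBeatCallback beatCallback;
@end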

You should also set up a custom struct for passing it, together with the Link data, to your audio callback function.
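A hypothetical shape for that struct:

typedef struct {
    LinkData *linkData; // the Link data from above
    void *engine;       // unretained back-pointer to the AudioEngine instance
} CallbackData;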

Create a private reference for it in your implementation file.
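Continuing with the hypothetical names from above:

@implementation AudioEngine {
    LinkData _linkData;
    CallbackData _callbackData;
}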

And pass it to the audio callback method instead of the bare LinkData.
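Something like this, using the struct as the refCon:

_callbackData.linkData = &_linkData;
_callbackData.engine = (__bridge void *)self;

AURenderCallbackStruct callback;
callback.inputProc = audioCallback;
callback.inputProcRefCon = &_callbackData;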

Then, in your audio callback function, pull it back out of the void *inRefCon parameter.
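Continuing the sketch:

CallbackData *data = (CallbackData *)inRefCon;
LinkData *linkData = data->linkData;
AudioEngine *engine = (__bridge AudioEngine *)data->engine;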

Now you can pass the current beat to your custom callback.
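For example, after committing the session state (ABLLinkBeatAtTime is the LinkKit call for reading the current beat; dispatching from the audio thread, as done here for brevity, is not strictly realtime-safe):

const Float64 beat =
    ABLLinkBeatAtTime(sessionState, hostTimeAtBufferBegin, engineData.quantum);
if (engine.beatCallback) {
    dispatch_async(dispatch_get_main_queue(), ^{
        engine.beatCallback(beat);
    });
}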

You can find the full example in this repo.

Integrating with Swift projects

For Swift projects, you need a bridging header file that includes the LinkKit headers.
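For example (the header names are from the LinkKit release; the bridging header’s file name is up to you):

// YourApp-Bridging-Header.h
#include "ABLLink.h"
#include "ABLLinkSettingsViewController.h"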

You also need to add libc++.tbd to the Linked Frameworks and Libraries section in the General tab of your project settings.

There is a Swift port of EngineData and LinkData as Swift structs that you can grab from here.

In this example, we will set up a custom timer and use it as our audio render loop, so it also covers projects with no Audio Units. The update function in the Swift port is the same code as the audio render loop above, so our custom timer should call it.
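A minimal sketch, assuming the port exposes update() on a shared ABLLinkManager instance (the 1/60-second interval is an arbitrary choice):

let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 60.0, repeats: true) { _ in
    ABLLinkManager.shared.update()
}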

If you subscribe to its listeners, you can react to tempo and start/stop changes. And of course, you can request those changes as well, simply by setting its properties.
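For example (the .connection listener appears in the comments below; the settable bpm property is an assumption about the port’s API):

ABLLinkManager.shared.add(listener: .connection({ isConnected in
    print("Link connected: \(isConnected)")
}))

// Requesting a change is just setting a property.
ABLLinkManager.shared.bpm = 120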

For more information on Ableton Link, please visit their website.
(Blog photo courtesy of Ableton Link website)


Comments (15)

I’m trying to install the full example, but getting this error:

Undefined symbols for architecture x86_64:
“_ABLLinkSetIsPlaying”, referenced from:
audioCallback(void*, unsigned int*, AudioTimeStamp const*, unsigned int, unsigned int, AudioBufferList*) in AudioEngine.o
“_OBJC_CLASS_$_ABLLinkSettingsViewController”, referenced from:
objc-class-ref in AudioEngineManager.o
“_ABLLinkRequestBeatAtStartPlayingTime”, referenced from:
audioCallback(void*, unsigned int*, AudioTimeStamp const*, unsigned int, unsigned int, AudioBufferList*) in AudioEngine.o
“_ABLLinkCommitAudioSessionState”, referenced from:
audioCallback(void*, unsigned int*, AudioTimeStamp const*, unsigned int, unsigned int, AudioBufferList*) in AudioEngine.o
“_ABLLinkSetSessionTempoCallback”, referenced from:
-[AudioEngine initLinkData:] in AudioEngine.o
“_ABLLinkCaptureAudioSessionState”, referenced from:
audioCallback(void*, unsigned int*, AudioTimeStamp const*, unsigned int, unsigned int, AudioBufferList*) in AudioEngine.o
“_ABLLinkIsPlaying”, referenced from:
-[AudioEngine isPlaying] in AudioEngine.o
audioCallback(void*, unsigned int*, AudioTimeStamp const*, unsigned int, unsigned int, AudioBufferList*) in AudioEngine.o
“_ABLLinkGetTempo”, referenced from:
-[AudioEngine bpm] in AudioEngine.o

Does this example work for non-Audio Unit projects?

Do we need to implement the audioCallback and the LinkData struct in Swift non-Audio Unit projects? I’m totally lost.

It’s for both, actually. If you have an Audio Unit, then you should use that Audio Unit’s render loop to implement Link. If you don’t, you’re going to need a custom timer to use as a render loop. The tutorial covers both. You need the LinkData struct in both cases.

Hey @RICK,
It was a CocoaPods issue, which is fixed now. You can build it without any errors.

Hey Cem,

I’m trying to run AuSequencer but getting this error in Xcode:

“ld: library not found for -lABLLink
clang: error: linker command failed with exit code 1 (use -v to see invocation)”

Link documentation says:

“libABLLink.a: A static library containing the implementation of Link. This file is not in the repo – you must download a release to get it.”

I don’t understand what release I have to download or where.

Thanks in advance for any help.

Hi Guido,

Yes, LinkKit is a private SDK and you need to request it from Ableton by filling out a form, here:
https://ableton.github.io/linkkit/

They will give you access to their private GitHub repo where LinkKit lives. Then you should download the latest release and import it into your project.

Thanks. Now, in a non-Audio Unit project, how do I implement the render loop? The timer is triggering the update function, and that should drive the render loop… but it’s in C… should it be located in another class? I’m lost here, could you give some clue? An example would be really appreciated.

Ok, so… the #selector(update) on the timer is the function in ABLLinkManager, from your Swift port. But I’m getting two errors:

1.
timer = Timer.scheduledTimer(
    timeInterval: timerSpeed,
    target: ABLLinkManager.shared,
    selector: #selector(update),
    userInfo: nil,
    repeats: true)

Error … Use of unresolved identifier ‘update’

2.
ABLLinkManager.shared.add(listener: .connection({ isConnected in
    if isConnected {
        ABLLinkManager.shared.start()
    } else {
        ABLLinkManager.shared.stop()
    }
}))

Error … Value of type ‘ABLLinkManager’ has no member ‘start’
Error … Value of type ‘ABLLinkManager’ has no member ‘stop’

If anyone is having this issue: the start() and stop() functions were removed in previous ABLLinkManager.swift revisions, and the example on GitHub is not updated. They just start or stop the timer.

How would you recommend synchronising AKSequencer to the beat time of Ableton Link (i.e. starting the sequencer on the beat)?

You should check the beatTime; if it’s less than zero, you need to wait until the next bar to start playing the sequence.
You can calculate the current beat within the bar with the code below, as the LinkHut example also demonstrates:

floor(fmod(_quantum + beatTime, _quantum))

Thanks for the quick response – that was my approach. I set up a timer that checked the playing beat every 1 ms, and I also set a shouldPlay flag when the sequencer was started in the UI. In the timer callback I checked whether shouldPlay was true, and if so, whether we were at the start of the bar.
This worked in some cases, but most of the time the sequencer would start a little off the beat (noticeable by ear). I’m guessing this has to do with me using a timer instead of an audio render loop.

How do you recommend sending requestStart & requestStop from a Swift project?

Hello! I’m currently working on integrating Ableton Link with AKSequencer and I’m having the same issue as you. Have you found any solution?

Here is how I did it. My general setup is quite similar to AudioKit Synth One, which means Swift -> Objective-C -> C++. I was confused about AudioUnitSetProperty as well, and figured out that you don’t need it. Instead, you can initialize Ableton Link directly in the init method of your core C++ audio file, like this:

void abletonSetup() {
    struct mach_timebase_info timeInfo = mach_timebase_info_data_t();
    mach_timebase_info(&timeInfo);

    ABLLinkRef linkRef = ABLLinkNew(bpm);
    EngineData sharedEngineData = EngineData();
    EngineData localEngineData = EngineData();

    _linkData = LinkData();
    _linkData.ablLink = linkRef;
    _linkData.sampleRate = sampleRate;
    _linkData.secondsToHostTime = (1.0e9 * Float64(timeInfo.denom)) / Float64(timeInfo.numer);
    _linkData.sharedEngineData = sharedEngineData;
    _linkData.localEngineData = localEngineData;
    _linkData.timeAtLastClick = 0;
    _linkData.isPlaying = false;
}

I copied pullEngineData() and audioCallback() and the structs from this tutorial into C++. Then I created a public variable of type LinkData and made sure that audioCallback() doesn’t depend on any data coming in through parameters. For example:

UInt64 hostTimeAtBufferBegin = mach_absolute_time() + engineData.outputLatency;

Then just call audioCallback() from your main audio-thread process function.
It took me a while to understand that audioCallback() is the place to put my own app logic.
For counting in and starting in sync I use the solution from @cemolcay. Thanks!

In order to show Ableton’s settings (ABLLinkSettingsViewController) in the frontend, Swift now references the ABLLinkRef (_linkData.ablLink) from my C++ class via bridging functions.

Finally, there is this amazing tutorial for adding an Audio Unit / Audiobus to your app:
https://audiokit.io/audiobus/

Audiobus combined with Ableton Link works pretty well.

Sometimes there is a bit of latency, though.
I am not sure yet where to add the latency that is introduced by my app’s own software.
Should it be added to _linkData.sharedEngineData.outputLatency, or only to hostTimeAtBufferBegin inside the audioCallback() function? What is its format? I guess it’s samples? And what about the buffer size? *confused
