
AUv3 MIDI Plugins for iOS Developers

Abstract: this text provides tips, background information and considerations for iOS developers wishing to create AUv3 MIDI plugins or to support MIDI plugins in AU host software. It assumes familiarity with ‘normal’ AUv3 plugin development.

Edit Feb.’18: Apple have clarified some of the open questions I had. I have made some amendments to the text below to reflect these new insights.

Introduction

One of the most interesting music-related novelties quietly introduced in iOS 11 is the ability for AUv3 Extension plugins to send MIDI messages to their host. It was first demonstrated in Apple’s WWDC’17 CoreAudio video:

https://developer.apple.com/videos/play/wwdc2017/501/

(Jump to 39:00 to hear the explanation and 44:00 to see a demo)

Apparently, Apple’s main use case for the feature was to support AU instruments that have their own virtual keyboard (or other performance UI). Users can then record their performance in any compatible AU host.

However, the feature also opens doors for custom sequencers and other MIDI generators that don’t necessarily need their own sound source or synth engine. In my opinion this is an even richer application angle than the one shown in the WWDC video, as it supports my personal belief that musicians should be able to mix and match sequencers and sound sources, rather than having a sequencer tied to a particular sound source.

The Rozeta sequencer suite is a first practical example of where I see lots of potential for MIDI AU plugins.

A. For Host Developers

AUv3 MIDI plugins are essentially identical to normal AU instrument plugins. They provide audio I/O buses (which they can ignore) and expect render calls at frequent intervals. So if you already have a working infrastructure for AU instruments, you’re 80% done.

Edit Feb.’18: AU MIDI plugins can carry two different type identifiers: ‘aumu’ plugins (the regular AU instrument type) can combine audio output with MIDI output, whereas ‘aumi’ plugins are pure MIDI processors. Structurally the two are identical (including audio buses), but ‘aumi’ plugins are not expected to generate or process audio. Just MIDI.

Edit May ’18: AU MIDI plugins can theoretically also come in the form of effects plugins (‘aufx’ or ‘aufm’), as demonstrated by FAC Envolver and Ruismaker Perforator, for example. Very few hosts support this at the time of writing.
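
For reference, these four-character codes correspond to the following AudioToolbox type constants (a small Swift snippet; the constant names come from the SDK, the variable names are only for illustration):

import AudioToolbox

// The component types relevant to AU MIDI, as AudioToolbox constants.
let instrumentType    = kAudioUnitType_MusicDevice    // 'aumu': instrument, may also output MIDI
let midiProcessorType = kAudioUnitType_MIDIProcessor  // 'aumi': pure MIDI processor
let effectType        = kAudioUnitType_Effect         // 'aufx': audio effect
let musicEffectType   = kAudioUnitType_MusicEffect    // 'aufm': audio effect that also accepts MIDI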

On top of the regular AU functionality, AU MIDI plugins are given a ‘block’ by the host which they can invoke from the regular render call. This block, called midiOutputEventBlock, is used by the plugin to send a list of MIDI packets back to the host.
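
In Swift terms the block looks roughly like this (a minimal sketch; only the AUMIDIOutputEventBlock type and its parameters come from AudioToolbox, the closure body is a placeholder):

import AudioToolbox

// The block the host hands to the plugin (AudioToolbox, iOS 11 and up).
// Parameters: absolute sample time of the event, virtual cable index,
// number of MIDI bytes, and a pointer to those bytes.
let midiOutputHandler: AUMIDIOutputEventBlock = { eventSampleTime, cable, length, midiBytes in
    // A host would route the bytes into its MIDI engine here (placeholder).
    return noErr
}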

Loading AU MIDI plugins

Hosts typically take the following steps to load and handle AU MIDI extensions:

  1. instantiate the AU extension
  2. check for the existence of “midiOutputNames”. This is an array of NSStrings. If this field is nil, there will be no MIDI output from this plugin. If it is set, it contains the name(s) of the virtual MIDI output cable(s). This is your indication that the plugin intends to send MIDI to the host. Note: Rozeta only uses one output cable.
  3. before any call to allocateRenderResourcesAndReturnError on the plugin, the host must provide these three important blocks (see the sketch after this list):
    • midiOutputEventBlock
    • musicalContextBlock
    • transportStateBlock
  4. during each call to the plugin’s renderBlock, the plugin can call the cached midiOutputEventBlock to send MIDI events back to the host. Only one call per virtual MIDI cable will be made per render cycle, so all MIDI events for this buffer duration are combined into a single MIDI packet list. This list has the same format as the MIDI packet list sent from the host to the AU.
  5. important: information passed to the plugin using the transportStateBlock and musicalContextBlock must be complete and as accurate as possible, as these will be used by the plugin to determine the host’s sequencer state and how to sync MIDI events to the host’s timeline! I recommend using double-precision floats for any time-related calculation and being extra careful with rounding.
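
To make this concrete, here is a minimal host-side sketch of steps 1 to 3 in Swift. The block bodies are simplified placeholders (a real host reports its actual tempo, beat position and transport state), and allocateRenderResources() is the Swift rendering of allocateRenderResourcesAndReturnError:

import AVFoundation
import AudioToolbox

func loadMIDIPlugin(_ description: AudioComponentDescription) {
    // 1. Instantiate the AU extension.
    AVAudioUnit.instantiate(with: description, options: []) { avAudioUnit, _ in
        guard let avAudioUnit = avAudioUnit else { return }
        let au = avAudioUnit.auAudioUnit

        // 2. midiOutputNames reveals whether the plugin intends to send MIDI
        //    (empty/nil means no MIDI output from this plugin).
        guard !au.midiOutputNames.isEmpty else { return }

        // 3. Install the three blocks before allocateRenderResources is called on the plugin.
        au.midiOutputEventBlock = { eventSampleTime, cable, length, midiBytes in
            // 4. Called at most once per cable per render cycle with a complete packet list;
            //    forward the bytes into the host's MIDI routing here (placeholder).
            return noErr
        }
        au.musicalContextBlock = { tempo, timeSigNumerator, timeSigDenominator,
                                   beatPosition, sampleOffsetToNextBeat, measureDownbeat in
            // 5. Report the real musical context; placeholder values shown here.
            tempo?.pointee = 120.0
            beatPosition?.pointee = 0.0
            return true
        }
        au.transportStateBlock = { flags, samplePosition, cycleStart, cycleEnd in
            flags?.pointee = [.moving]
            samplePosition?.pointee = 0.0
            return true
        }

        // Only now should the host trigger allocateRenderResources, e.g. by attaching
        // the node to an AVAudioEngine and starting playback.
    }
}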

Detecting AU MIDI plugins

As described above, AU MIDI plugins can be easily identified after the AUv3 has been instantiated: simply check whether the “midiOutputNames” field is present. If it is, you’re dealing with an AU MIDI plugin.

However, it can also be interesting for hosts to know before instantiating whether an extension is capable of sending MIDI, for example to present users with a list of available MIDI plugins to choose from.

There is an official method for identifying MIDI-capable plugins: each AU extension has an optional boolean field named “hasMIDIOutput”. Unfortunately this mechanism is useless as it defaults to YES, meaning virtually all AU instrument plugins in the field today report YES when queried.

Edit Feb.’18: MIDI processors (plugins which do not output audio) are supposed to use the ‘aumi’ identifier, making them easy to identify on the system. However, at the time of writing, only one AU host supports this plugin format (AUM). 
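
As an illustration, a host could enumerate installed ‘aumi’ components up front, without instantiating anything, roughly like this (zeroed description fields act as wildcards; the print statement is just a placeholder):

import AVFoundation
import AudioToolbox

// Look up all installed MIDI processor ('aumi') components.
var anyMIDIProcessor = AudioComponentDescription()
anyMIDIProcessor.componentType = kAudioUnitType_MIDIProcessor
anyMIDIProcessor.componentSubType = 0          // wildcard
anyMIDIProcessor.componentManufacturer = 0     // wildcard

let components = AVAudioUnitComponentManager.shared().components(matching: anyMIDIProcessor)
for component in components {
    // hasMIDIOutput is also exposed here, but as noted above it defaults to YES for
    // practically every 'aumu' instrument, so it is not a reliable filter by itself.
    print(component.name, component.manufacturerName, component.hasMIDIOutput)
}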

Event timing

Take note that MIDI events received from plugins will have absolute timestamps. Whereas MIDI events sent from the host to the plugin typically have relative offsets (AUEventSampleTimeImmediate + frameOffset), it works differently in the other direction! To get the frame offset, subtract the current render timestamp from the MIDI event’s timestamp:

frameOffset = eventTimeStamp->mSampleTime - now->mSampleTime
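
The same computation expressed as a small Swift helper (the function name is mine; renderTimestamp stands for the AudioTimeStamp of the render cycle the host is currently processing, and eventSampleTime is the value handed to the midiOutputEventBlock):

import AudioToolbox

// Convert the plugin's absolute event time into an offset (in frames) within
// the buffer the host is currently rendering.
func frameOffset(of eventSampleTime: AUEventSampleTime,
                 in renderTimestamp: AudioTimeStamp) -> Double {
    return Double(eventSampleTime) - renderTimestamp.mSampleTime
}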

B. For Plugin Developers

AU Instrument vs. AU MIDI Processor

In theory there are two valid AU MIDI formats. Apple presented the “aumu” type in its WWDC’17 CoreAudio video. Also known as kAudioUnitType_MusicDevice, this is the regular AU instrument plugin type we already knew. It allows plugins to output audio, MIDI or both (Moog Model 15 is an example of a plugin that does both). This is also the format implemented in Apple’s GarageBand, and most other AU MIDI-compatible hosts.

Additionally, there exists another format: “aumi”, or kAudioUnitType_MIDIProcessor, which can be found in the CoreAudio header files. While this sounds like a more suitable format for MIDI plugins, it comes with several potential problems and risks, such as being completely undocumented by Apple and therefore possibly misinterpreted on a practical and/or conceptual level by the iOS developer community.

Edit Feb.’18: Apple have clarified the use of the ‘aumi’ format. This is the preferred format for AU plugins which do not output audio. Apart from the identifier (‘aumi’ vs. ‘aumu’) the plugin formats are structurally identical. Both feature audio buses, which can be used to derive sample rates and other information about the current audio session.

At the time of writing, only one AU host supports this MIDI processor format (AUM). Until adoption has broadened and Apple have formalized ‘aumi’ in their documentation, Rozeta will remain based on the more ubiquitous ‘aumu’ format, even though it obviously produces no audio output.

This ‘aumu’ approach is what’s described below. Should you want to experiment with the ‘aumi’ format, simply change the type identifier; everything else should be identical.
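
For an AUv3 app extension the four-character type code is declared in the AudioComponents entry of the extension’s Info.plist, and the matching AudioComponentDescription is passed to your AUAudioUnit subclass when it is instantiated. A minimal sketch of the distinction in code (the helper function is purely illustrative):

import AudioToolbox

// The only difference between the two formats is the component type; everything
// else in the plugin implementation can stay identical.
func isPureMIDIProcessor(_ description: AudioComponentDescription) -> Bool {
    return description.componentType == kAudioUnitType_MIDIProcessor   // 'aumi' rather than 'aumu'
}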

MIDI AU Basics

As explained above, AU MIDI plugins are quite simply stripped-down AU instrument plugins with some added functionality for sending MIDI back to the AU host. If you already have a working AUv3 instrument you have a good starting point. You can ignore the audio processing part of the code, but you still need to set up the I/O buses to be able to access the current sample rate of your host.

  • Set up the midiOutputNames array property of your AudioUnit: this is an array of strings and is used by hosts to determine if your plugin intends to send MIDI. Every item in the array represents the name of a virtual MIDI cable (Rozeta plugins only use a single virtual cable).
  • during each call to “allocateRenderResourcesAndReturnError” you must cache (or re-cache) three blocks provided to you by the host:
    • midiOutputEventBlock
    • musicalContextBlock
    • transportStateBlock
  • whenever the host calls your renderBlock you can build a MIDI packet list (using the same format as the MIDI messages sent from the host to the AU). When you’re done, send it to the host, in a single call per virtual cable, using the cached midiOutputEventBlock (a sketch of these steps follows this list).
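
Putting those three bullets together, a condensed Swift sketch of the relevant parts of an AUAudioUnit subclass could look like this. The class name is hypothetical, and a real plugin obviously also needs audio buses, an internalRenderBlock and actual sequencing logic:

import AudioToolbox

class MIDISequencerAudioUnit: AUAudioUnit {          // hypothetical plugin class

    // Hosts read this to learn that we intend to send MIDI; one entry per virtual cable.
    override var midiOutputNames: [String] { return ["MIDI Out"] }

    // Host-provided blocks, cached for use from the render thread.
    var cachedMIDIOutputBlock: AUMIDIOutputEventBlock?
    var cachedMusicalContextBlock: AUHostMusicalContextBlock?
    var cachedTransportStateBlock: AUHostTransportStateBlock?

    // Swift rendering of allocateRenderResourcesAndReturnError.
    override func allocateRenderResources() throws {
        try super.allocateRenderResources()
        cachedMIDIOutputBlock = midiOutputEventBlock
        cachedMusicalContextBlock = musicalContextBlock
        cachedTransportStateBlock = transportStateBlock
    }

    override func deallocateRenderResources() {
        cachedMIDIOutputBlock = nil
        cachedMusicalContextBlock = nil
        cachedTransportStateBlock = nil
        super.deallocateRenderResources()
    }
}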

Event timing

Take note that MIDI events sent from plugin to host will have absolute timestamps. Whereas MIDI events sent from the host to the plugin typically have relative offsets (AUEventSampleTimeImmediate + frameOffset), it works differently in the other direction! Plugins are supposed to send absolute timestamps based on the timestamp received in the render call:

eventTimeStamp = inTimeStamp->mSampleTime + frameOffset
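
Continuing the hypothetical MIDISequencerAudioUnit sketch from above, the render block could send a single note-on with an absolute timestamp roughly like this. The frame offset is a placeholder that would normally come from the plugin’s own sequencing logic, and production code would avoid capturing self on the audio thread:

// Inside the MIDISequencerAudioUnit class body from the sketch above:
override var internalRenderBlock: AUInternalRenderBlock {
    return { [weak self] actionFlags, timestamp, frameCount, outputBusNumber,
                         outputData, realtimeEventListHead, pullInputBlock in
        guard let sendMIDI = self?.cachedMIDIOutputBlock else { return noErr }

        // Absolute event time = render timestamp + offset of the event within this buffer.
        let frameOffset: AUEventSampleTime = 0                 // placeholder; computed by the sequencer
        let eventTime = AUEventSampleTime(timestamp.pointee.mSampleTime) + frameOffset

        let noteOn: [UInt8] = [0x90, 60, 100]                  // note-on, middle C, velocity 100
        _ = sendMIDI(eventTime, 0, noteOn.count, noteOn)       // cable 0: one call per cable per cycle
        return noErr
    }
}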

 

That’s it in a nutshell. As you can see it’s only a small extra step to make an AU MIDI plugin if you’re already familiar with the AUv3 instrument architecture.

 

Acknowledgement

I would like to thank Jonatan Liljedahl (of Kymatica/AUM fame) for pioneering the MIDI AU format with me. Not only was AUM the first host to support AU MIDI plugins with full MIDI routing, but Jonatan also put considerable time and brainpower into thinking about the best (and most future-proof) ways to turn this mostly undocumented format into something meaningful and useful for iOS.

Additional thanks go out to Mathieu Garcia of INTUA for experimenting, sparring and thinking along as the exact specs emerged from the mists of vagueness.