iPhone Core Audio Part 3 – Audio Callback

Previous: Setting up the AUGraph.

In the previous two posts, we set up the project and hooked up the audio plumbing. Now we finally get to the actual work. Amazingly, there's just one function left to write. The audio render callback is the function you provided earlier to be the input to the mixer AudioUnit. Whenever the mixer needs new audio input, it calls the render callback, and it is up to you to fill a buffer with audio samples. It is a C function with this specific set of parameters (the exact prototype appears after the list):

  • inRefCon – A pointer to an object that is used to pass in parameters.
  • AudioUnitRenderActionFlags – Indicates special rendering states; we won't need it here.
  • AudioTimeStamp – Used if you need to synchronize multiple sources.
  • inBusNumber – The specific bus of the Audio Unit that called the function.
  • inNumberFrames – The number of frames of sample data that will be passed in.
  • ioData – An AudioBufferList, which is a struct containing an array of buffers representing sample data and a count of those buffers.
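For reference, the callback has to match the AURenderCallback prototype declared in the AudioUnit framework headers:

    typedef OSStatus (*AURenderCallback)(void                       *inRefCon,
                                         AudioUnitRenderActionFlags *ioActionFlags,
                                         const AudioTimeStamp       *inTimeStamp,
                                         UInt32                     inBusNumber,
                                         UInt32                     inNumberFrames,
                                         AudioBufferList            *ioData);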

Here’s what we’re going to be doing (a sketch of the full callback follows this list):

  • Getting a pointer “THIS” so we can access AudioController variables.
  • Getting a pointer to the buffer we want to write to (here there is only one buffer; it will be at index [0]).
  • Performing some preliminary setup for the sine wave.
  • Looping inNumberFrames times, calculating a sine wave and writing sample values to the buffer.
  • Saving any AudioController variables that need to be remembered across calls to the render function.
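Here is a minimal sketch of what that callback can look like. It assumes 16-bit signed mono samples at a 44.1 kHz sample rate, and an AudioController class with @public double ivars named frequency and phase; those names (and the function name renderSineCallback) are assumptions for illustration, not necessarily the exact code in this post.

    #import <AudioToolbox/AudioToolbox.h>
    #import "AudioController.h"   // assumed to declare @public double ivars: frequency, phase
    #include <math.h>

    static OSStatus renderSineCallback(void                       *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp       *inTimeStamp,
                                       UInt32                     inBusNumber,
                                       UInt32                     inNumberFrames,
                                       AudioBufferList            *ioData)
    {
        // 1. Get a pointer "THIS" so we can access AudioController variables
        //    (pre-ARC cast; under ARC this would need a __bridge cast).
        AudioController *THIS = (AudioController *)inRefCon;

        // 2. Get a pointer to the buffer we want to write to (only one buffer here).
        SInt16 *samples = (SInt16 *)ioData->mBuffers[0].mData;

        // 3. Preliminary setup for the sine wave: current phase and per-sample increment.
        double phase     = THIS->phase;
        double increment = 2.0 * M_PI * THIS->frequency / 44100.0;   // assumes 44.1 kHz

        // 4. Loop inNumberFrames times, calculating the sine wave and writing samples.
        for (UInt32 frame = 0; frame < inNumberFrames; frame++) {
            samples[frame] = (SInt16)(sin(phase) * 32767.0);
            phase += increment;
            if (phase > 2.0 * M_PI) phase -= 2.0 * M_PI;             // keep the phase bounded
        }

        // 5. Save the AudioController variables that must persist across calls.
        THIS->phase = phase;
        return noErr;
    }

Saving the phase is what keeps the waveform continuous from one callback to the next; if each call restarted at phase zero, the discontinuities between buffers would be audible as clicks.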

Continue reading

iPhone Core Audio Part 2 – Setting up the AUGraph

Previous: iPhone Core Audio – Getting Started

In the previous section we went over the tedious details of getting a project set up. Now we are almost ready to start exploring some code. But first, some Core Audio concepts.

  • Audio Unit – A collection of audio processing code that is bundled up for reuse in many applications. This is the same concept as a VST or AudioSuite plug-in. On OS X there is a good-sized set of built-in Audio Units that do common tasks like filtering or reverb, and you can install third-party AU plug-ins. On iPhone OS the built-in set is more limited, and there is not (yet) any way to load third-party AudioUnits.
  • AUGraph – A set of functions for loading, managing, and connecting AudioUnits into a signal processing system.
  • remoteIO – An audio unit that connects to the iPhone’s audio input and output hardware.
  • AudioComponentDescription – A struct used to identify an AudioUnit.
  • AudioStreamBasicDescription – A struct that holds information about constant bitrate audio that will be transferred, whether to a file, through audio units, or out to audio hardware. (Both description structs are sketched just after this list.)
  • CAStreamBasicDescription – A wrapper around the AudioStreamBasicDescription struct that provides some convenience methods for modifying values and printing.
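As a quick illustration (the exact values are assumptions, not this post's code), here is how those two description structs are commonly filled in for a RemoteIO unit and 16-bit, interleaved, stereo linear PCM:

    #import <AudioToolbox/AudioToolbox.h>

    // Identify the built-in RemoteIO audio unit.
    AudioComponentDescription ioDescription = {0};
    ioDescription.componentType         = kAudioUnitType_Output;
    ioDescription.componentSubType      = kAudioUnitSubType_RemoteIO;
    ioDescription.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Describe 16-bit signed, interleaved stereo LPCM at 44.1 kHz.
    AudioStreamBasicDescription streamFormat = {0};
    streamFormat.mSampleRate       = 44100.0;
    streamFormat.mFormatID         = kAudioFormatLinearPCM;
    streamFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    streamFormat.mFramesPerPacket  = 1;   // uncompressed audio: one frame per packet
    streamFormat.mChannelsPerFrame = 2;
    streamFormat.mBitsPerChannel   = 16;
    streamFormat.mBytesPerFrame    = 4;   // 2 channels x 2 bytes per sample
    streamFormat.mBytesPerPacket   = 4;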

Inside the AudioController class we will set up an AUGraph containing a remoteIO audio unit and a multi-channel mixer audio unit. In your AudioController.h file you will need to (a sketch of the resulting header follows this list):

  • Import the AudioToolbox framework
  • Import CAStreamBasicDescription.h
  • Create instance variables to represent an AUGraph and a Mixer audio unit.
  • Add methods to initialize, start and stop the AUGraph.
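Put together, the header might look roughly like this (the ivar and method names are my own placeholders, not necessarily the ones used in this post):

    // AudioController.h
    #import <Foundation/Foundation.h>
    #import <AudioToolbox/AudioToolbox.h>
    #import "CAStreamBasicDescription.h"   // a C++ class, so files that import this
                                           // header must be compiled as Objective-C++ (.mm)

    @interface AudioController : NSObject {
        AUGraph   mGraph;   // the processing graph that owns the audio units
        AudioUnit mMixer;   // the multichannel mixer unit inside the graph
    }

    - (void)initializeAUGraph;
    - (void)startAUGraph;
    - (void)stopAUGraph;

    @end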

Continue reading