iPhone Core Audio Part 3 – Audio Callback

Previous: Setting up the AUGraph.

In the previous two posts, we set up the project and hooked up the audio plumbing. Now we finally get to the actual work. Amazingly, there’s just one function left to write. The audio render callback is the function you provided earlier to be the input to the mixer AudioUnit. Whenever the mixer needs new audio input, it calls the render callback, and it is up to you to fill a buffer with audio samples. It is a C function with this specific set of parameters (the exact signature appears just after the list):

  • inRefCon – A pointer to an object of your choosing, used to pass your own data (here, our AudioController) into the callback.
  • AudioUnitRenderActionFlags – Indicates special rendering states; we won’t need it here.
  • AudioTimeStamp – Used if you need to synchronize multiple sources.
  • inBusNumber – The specific bus of the Audio Unit for which the callback is being invoked.
  • inNumberFrames – The number of frames of sample data that will be passed in.
  • ioData – An AudioBufferList, which is a struct containing an array of buffers representing sample data and a count of those buffers.
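
In code, that parameter list is the standard AURenderCallback prototype from the AudioUnit headers; only the function name below is a placeholder of my choosing:

    static OSStatus renderCallback(void                        *inRefCon,
                                   AudioUnitRenderActionFlags  *ioActionFlags,
                                   const AudioTimeStamp        *inTimeStamp,
                                   UInt32                       inBusNumber,
                                   UInt32                       inNumberFrames,
                                   AudioBufferList             *ioData);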

Here’s what we’re going to be doing; a complete sketch of the callback follows the list.

  • Getting a pointer “THIS” by casting inRefCon, so we can access AudioController variables.
  • Getting a pointer to the buffer we want to write to (here there is only one buffer, at index [0]).
  • Performing some preliminary setup for the sine wave.
  • Looping inNumberFrames times, calculating the sine wave and writing a sample value to the buffer on each pass.
  • Saving any AudioController variables that need to be remembered across calls to the render function.
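
Putting those steps together, here is a minimal sketch of the whole callback. It assumes a mono, 16-bit signed integer stream format at 44,100 Hz, and an AudioController class with a @public double instance variable named sinPhase; those are choices made elsewhere in this project, not requirements of the API:

    // Lives in AudioController.mm; the header from Part 2 declares sinPhase.
    #import "AudioController.h"
    #include <math.h>

    static OSStatus renderCallback(void                        *inRefCon,
                                   AudioUnitRenderActionFlags  *ioActionFlags,
                                   const AudioTimeStamp        *inTimeStamp,
                                   UInt32                       inBusNumber,
                                   UInt32                       inNumberFrames,
                                   AudioBufferList             *ioData)
    {
        // 1. Recover the AudioController instance we registered as inRefCon.
        AudioController *THIS = (AudioController *)inRefCon;

        // 2. Point at the buffer to fill; mono means one buffer, at index 0.
        SInt16 *samples = (SInt16 *)ioData->mBuffers[0].mData;

        // 3. Sine setup: radians to advance per sample for 600 Hz at 44.1 kHz.
        double       phase          = THIS->sinPhase;
        const double phaseIncrement = 2.0 * M_PI * 600.0 / 44100.0;

        // 4. Write inNumberFrames samples of the sine wave at half amplitude.
        for (UInt32 frame = 0; frame < inNumberFrames; frame++) {
            samples[frame] = (SInt16)(sin(phase) * 0.5 * 32767.0);
            phase += phaseIncrement;
            if (phase >= 2.0 * M_PI) phase -= 2.0 * M_PI;
        }

        // 5. Save the phase so the wave stays continuous across callbacks.
        THIS->sinPhase = phase;
        return noErr;
    }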

Continue reading

iPhone Core Audio Part 2 – Setting up the AUGraph

Previous: iPhone Core Audio – Getting Started

In the previous section we went over the tedious details of getting a project set up. Now we are almost ready to start exploring some code. But first, some Core Audio concepts.

  • Audio Unit – A collection of audio processing code that is bundled up for reuse in many applications. This is the same concept as a VST or AudioSuite plug-in. On OS X there is a good-sized set of built-in Audio Units that do common tasks like filtering or reverb, and you can install third-party AU plug-ins. On iPhone OS the built-in set is more limited, and there is not (yet) any way to load third-party AudioUnits.
  • AUGraph – A set of functions for loading, managing, and connecting AudioUnits into a signal processing system.
  • remoteIO – An audio unit that connects to the iPhone’s audio input and output hardware.
  • AudioComponentDescription – A struct used to identify an AudioUnit; there is a short example just after this list.
  • AudioStreamBasicDescription – A struct that holds information about constant-bitrate audio that will be transferred: either to a file, through audio units, or out to audio hardware.
  • CAStreamBasicDescription – A C++ wrapper around the AudioStreamBasicDescription struct that provides some convenience methods for modifying values and printing.

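For example, here is how the AudioComponentDescription structs for this project’s two units might be filled in; the constants are Apple’s, the variable names are mine:

    AudioComponentDescription outputDesc;            // identifies remoteIO
    outputDesc.componentType         = kAudioUnitType_Output;
    outputDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    outputDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    outputDesc.componentFlags        = 0;
    outputDesc.componentFlagsMask    = 0;

    AudioComponentDescription mixerDesc;             // identifies the mixer
    mixerDesc.componentType          = kAudioUnitType_Mixer;
    mixerDesc.componentSubType       = kAudioUnitSubType_MultiChannelMixer;
    mixerDesc.componentManufacturer  = kAudioUnitManufacturer_Apple;
    mixerDesc.componentFlags         = 0;
    mixerDesc.componentFlagsMask     = 0;
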
Inside of the AudioController class we will set up an AUGraph containing a remoteIO audio unit and a multi-channel mixer audio unit. In your AudioController.h file you will need to do the following (a sketch of the finished header appears after the list):

  • Import the AudioToolbox framework.
  • Import CAStreamBasicDescription.h.
  • Create instance variables to represent the AUGraph and the mixer audio unit.
  • Add methods to initialize, start, and stop the AUGraph.
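
Following that checklist, AudioController.h comes out to something like the sketch below. The instance variable and method names are mine rather than fixed by the API, and since CAStreamBasicDescription is a C++ class, files importing this header should be compiled as Objective-C++ (.mm):

    #import <Foundation/Foundation.h>
    #import <AudioToolbox/AudioToolbox.h>
    #import "CAStreamBasicDescription.h"

    @interface AudioController : NSObject
    {
      @public
        AUGraph   mGraph;    // manages and connects the remoteIO and mixer units
        AudioUnit mMixer;    // kept around so we can set mixer properties later
        double    sinPhase;  // sine wave phase, remembered across render callbacks
    }

    - (void)initializeAUGraph;
    - (void)startAUGraph;
    - (void)stopAUGraph;

    @end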

Continue reading

iPhone Core Audio Part 1 – Getting Started

After I’ve been working on a project for a while, I tend to tuck away and forget about the low-level plumbing details I needed to get the project off the ground. Consequently, when I want to quickly try out a new idea in a new project, I end up wasting time remembering all of the i’s I needed to dot and t’s I needed to cross. So, as I am at the beginning of another project, I thought I would make a checklist for myself and share it with you, along with a little explanation of why each step is necessary. Hopefully it is useful for some of you out there as a “Hello World” tutorial of sorts.

Core Audio is complicated; there’s no getting around it. It is capable of some amazing things, but getting a simple project started requires synthesizing a large and disparate set of concepts, documentation, mailing lists, and code samples. While Apple’s documentation is light years better now than it was back in the Panther/Tiger era, it still doesn’t do a great job of getting someone started. These are the steps I use to build an iPhone project that uses the remoteIO AudioUnit. RemoteIO is the system to use if you need low-latency audio input/output for tasks such as a synthesizer app. If you just need to play a few sound effects, you should check out System Sound Services or OpenAL. If you want to stream large media files, you should check out Audio Queue Services.

There are 3 basic tasks that need to be addressed to get live audio going.

  1. Setting up the Xcode project so that it has all of the necessary files and libraries. (Appeasing the #include Gods.)
  2. Initializing and connecting various audio objects. (Setting up the plumbing.)
  3. Creating an audio callback function. (Where your work gets done.)

This project will perform the simplest audio task possible: playing a solid 600 Hz sine wave.
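
The math behind that tone is tiny: each output sample is the sine function evaluated at a phase that advances by 2π × 600 / sampleRate every sample. A throwaway C program makes it concrete (the 44.1 kHz rate here is an assumption; the real stream format gets set up in Part 2):

    #include <math.h>
    #include <stdio.h>

    // Print the first few samples of a 600 Hz sine wave at 44.1 kHz.
    int main(void)
    {
        const double sampleRate = 44100.0;
        const double frequency  = 600.0;
        for (int n = 0; n < 8; n++) {
            double sample = sin(2.0 * M_PI * frequency * (double)n / sampleRate);
            printf("sample[%d] = %f\n", n, sample);
        }
        return 0;
    }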

Continue reading