iPhone Core Audio Part 1 – Getting Started

After I’ve been working on a project for a while, I tend to tuck away and forget about the low-level plumbing details I needed to get the project off the ground. Consequently, when I want to quickly try out a new idea in a new project, I end up wasting time remembering all of the i’s I needed to dot and t’s I needed to cross. So, as I am at the beginning of another project, I thought I would make a checklist for myself and share it with you, along with a little explanation of why each step is necessary. Hopefully it is useful for some of you out there as a “Hello World” tutorial of sorts.

Core Audio is complicated; there’s no getting around it. It is capable of some amazing things, but getting a simple project started requires synthesizing a large and disparate set of concepts, documentation, mailing lists, and code samples. While Apple’s documentation is light years better now than it was back in the Panther/Tiger era, it still doesn’t do a great job of getting someone started. These are the steps I use to build an iPhone project that uses the remoteIO Audio Unit. RemoteIO is the system to use if you need low-latency audio input/output for tasks such as a synthesizer app. If you just need to play a few sound effects, you should check out System Sound Services or OpenAL. If you want to stream large media files, you should check out Audio Queue Services.
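For contrast, the “just play a sound effect” route is only a few lines with System Sound Services. This is a rough sketch rather than code from this project (the “click.caf” file name is made up, and AudioToolbox still needs to be linked as in step 2 below):

    #import <Foundation/Foundation.h>
    #import <AudioToolbox/AudioToolbox.h>

    // Play a short sound effect: no graph, no render callbacks, but also no
    // control over latency or synthesis.
    void PlayClick(void)
    {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"click" ofType:@"caf"];
        SystemSoundID soundID;
        AudioServicesCreateSystemSoundID((CFURLRef)[NSURL fileURLWithPath:path], &soundID);
        AudioServicesPlaySystemSound(soundID);
    }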

There are 3 basic tasks that need to be addressed to get live audio going.

  1. Setting up the XCode project so that it has all of the necessary files and libraries. (Appeasing the #include Gods.)
  2. Initializing and connecting various audio objects. (Setting up the plumbing.)
  3. Creating an audio callback function. (Where your work gets done.)

This project will perform the simplest audio task possible: playing a solid 600 Hz sine wave.
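As a peek ahead at task 3, the heart of that job is a render callback that fills each output buffer with samples of the sine. The sketch below is only illustrative: it assumes the stream format has been set to mono, 16-bit signed integer samples at 44.1 kHz, which is the kind of configuration that happens when the graph is built in Part 2.

    #include <AudioToolbox/AudioToolbox.h>
    #include <math.h>

    // Fill the output buffer with a 600 Hz sine wave.  Assumes mono SInt16
    // samples at 44.1 kHz; a real project would read these from the stream format.
    static OSStatus RenderTone(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
    {
        static double phase = 0.0;
        const double phaseStep = 2.0 * M_PI * 600.0 / 44100.0;   // 600 Hz
        SInt16 *samples = (SInt16 *)ioData->mBuffers[0].mData;
        for (UInt32 i = 0; i < inNumberFrames; i++) {
            samples[i] = (SInt16)(sin(phase) * 0.25 * 32767.0);  // keep the level modest
            phase += phaseStep;
            if (phase > 2.0 * M_PI) phase -= 2.0 * M_PI;
        }
        return noErr;
    }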

Let’s get started. I’m currently using iPhone SDK 3.1.3.

1. In XCode, go to the File menu and start a new iPhone project. I used a view-based template and called it “iPhoneAudio.”

2. Link to the AudioToolbox framework. AudioToolbox contains APIs for a bunch of useful things like writing files to disk, describing audio streams, and converting audio formats. (Once it’s linked, you can import its umbrella header anywhere you need it; the import line is shown just after this list.)

  • Right click and get info on your application target.
  • In the general tab click the plus to add a framework.
  • Select AudioToolbox.framework from the list.
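With the framework linked, its umbrella header can be imported wherever you need Core Audio types and functions:

    #import <AudioToolbox/AudioToolbox.h>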

3. Add some support files from PublicUtility to your project overview. None of these are strictly required, but a lot of sample code uses them (a quick taste of what they’re for follows this list). Apple’s iPhoneMultichannelMixerTest makes a group in “Other Sources” called “iPublicUtility,” so I follow that lead.

  • Drag the following files from /Developer/Extras/CoreAudio/PublicUtility on your system to your project overview in XCode.

CAStreamBasicDescription.h
CAStreamBasicDescription.cpp
CAMath.h
CAXException.h
CADebugMacros.h

  • Apple copies the files locally into the project directory. I follow this lead here as well, though it does mean you won’t automatically pick up updated versions of these files as they are released.
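To give a flavor of what these helpers buy you, here is a hypothetical snippet (none of these names are part of the project yet, and the real setup happens in Part 2). Because CAStreamBasicDescription is a C++ class, code like this has to live in a .mm or .cpp file, which is exactly what step 5 below is about.

    #include <AudioToolbox/AudioToolbox.h>
    #include "CAStreamBasicDescription.h"
    #include "CAXException.h"

    void PublicUtilityTaste()
    {
        // CAStreamBasicDescription fills in an AudioStreamBasicDescription for you.
        // Here: two channels, non-interleaved, in the AudioUnit canonical format.
        CAStreamBasicDescription format;
        format.SetAUCanonical(2, false);
        format.mSampleRate = 44100.0;
        format.Print();   // dump the format to the console for sanity checking

        // XThrowIfError (from CAXException.h) turns a failing OSStatus into a
        // C++ exception tagged with a readable message.
        AUGraph graph;
        XThrowIfError(NewAUGraph(&graph), "couldn't create the AUGraph");
    }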

4. Now we’re going to create a class to act as the central place for your audio code.

  • In XCode select New File… from the File menu.
  • In the Cocoa Touch section, select an Objective-C class and make it a subclass of NSObject. Click Next.
  • Call the files AudioController or some other fitting name and have it make both the .h and the .m file.

5. Working in Core Audio entails jumping between Objective-C and C/C++. Fortunately, since they are all related, this is pretty easy to do, at least from the compiler’s standpoint; you just have to tell it beforehand.

If you don’t, you will get unhelpful errors like this:

“Expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before…”

I forget to do this at least once in every project and then spend at least 20 minutes trying to figure out why the darn thing won’t compile. I’ve found that every class that imports AudioController.h needs to be an Obj-C/C++ file, and every class that imports those classes does as well, all the way up the importing chain. Apple’s examples somehow don’t do this, but I don’t know how they get around it. (I would love to be enlightened on this.) So go ahead and change iPhoneAudioViewController.m and iPhoneAudioAppDelegate.m to iPhoneAudioViewController.mm and iPhoneAudioAppDelegate.mm.
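To make the cascade concrete, here is a hypothetical sketch of where AudioController.h is headed (the ivar and methods are placeholders, not the finished header). As soon as it includes one of the C++ helpers from PublicUtility, every file that imports it, directly or indirectly, has to be compiled as Objective-C++:

    //  AudioController.h (sketch)
    #import <Foundation/Foundation.h>
    #import <AudioToolbox/AudioToolbox.h>
    #include "CAStreamBasicDescription.h"   // C++ header: this is what starts the .mm cascade

    @interface AudioController : NSObject {
        AUGraph mGraph;      // built in Part 2
    }
    - (void)start;           // placeholder methods
    - (void)stop;
    @end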

[Edit: Commenter arf writes: "if you add “-x objective-c++” to the OTHER C FLAGS in your build settings, everything will be compiled as ObjC++"]

  • Tell the compiler this will be an Objective-C++ file by changing the .m file extension to .mm
  • At this point your Overview should look like this:

6. Now we need to create an instance of your AudioController. There are lots of ways to do this, but we’re going to have the AudioController instantiated by the view controller’s .xib file. First we need to add an IBOutlet so that Interface Builder will know how to make connections to the audioController.

  • Add the line #import “AudioController.h” to iPhoneAudioViewController.h so the view controller knows about your class.
  • Add instance variables and properties to your main view controller.
  • //  iPhoneAudioViewController.h
    //  iPhoneAudio
    #import <UIKit/UIKit.h>
    #import "AudioController.h"
    
    @interface iPhoneAudioViewController : UIViewController {
    
    	IBOutlet AudioController *audioController;
    }
    
    @property (readonly, nonatomic) AudioController *audioController;
    @end
  • In iPhoneAudioViewController.mm synthesize your accessor methods.
  • @implementation iPhoneAudioViewController
    
    @synthesize audioController;
  • Remember to release it in the dealloc method: [audioController release]; (the sketch after this list pulls the .mm pieces together).
  • Save all the files and continue. You might also try building to check and see if you have things right so far.
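Pulled together, the relevant bits of iPhoneAudioViewController.mm look roughly like this (a sketch of just the audioController parts):

    //  iPhoneAudioViewController.mm (sketch of the audioController parts)
    #import "iPhoneAudioViewController.h"

    @implementation iPhoneAudioViewController

    @synthesize audioController;

    - (void)dealloc {
        [audioController release];   // release the outlet loaded from the nib
        [super dealloc];
    }

    @end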

7. Now that we have an IBOutlet we can make an instance of AudioController and make some connections.

  • Open iPhoneAudioViewController.xib in Interface Builder.
  • Open the Library tool (Shift-CMD-L) and drag an NSObject out onto your .xib.
  • Select the new object and then open the Inspector (Shift-CMD-I). Change its class to AudioController and give it a descriptive name.
  • Ctrl-click and drag a connection from File’s Owner to the AudioController object. Select the audioController outlet. Now, when the .xib is loaded the view controller will have a way to talk to the audioController.
  • Save the .xib file and close it for now.

Next: iPhone Core Audio Part 2 – Setting up the AUGraph.


10 thoughts on “iPhone Core Audio Part 1 – Getting Started”

  1. if you add “-x objective-c++” to the OTHER C FLAGS in your build settings, everything will be compiled as ObjC++

  2. Pingback: Decibel metering from an iPhone audio unit | Politepix

  3. Pingback: iPhone Core Audio « Tim Bolstad

  4. Thank you for this post. It did help me.
    Regarding the combination of C++ and Obj-C: you only need to change the ‘file type’ of the view controller to sourcecode.cpp.objcpp. This can be done in the info window of the class. There is no need to change the extension.
    I hope this helps.

  5. Pingback: iOS Development Link Roundup: Part 1 | iOS/Web Developer's Life in Beta

  6. soooooo glad I came across this tutorial. I’m just learning to make apps on the iPhone specifically with audio tools and music based apps in mind. You have given me a great start. Thanks soo much!

    P.S. your app is faaaaaantastic. It has so much detail. Must have taken you a long time to make. 感謝感謝！

  7. Hi Tim, I was wondering if you could help me. I am new to programming in Xcode and iOS but have experience with C and embedded programming. I was hoping you could help me clear up a few problems I am having while trying to get my head round things. I understand everything that is going on in the first two parts of your tutorial with configuring the audio graph, but I’m a little confused about render callbacks and how to hook up a buffer to store and access the audio passing through the unit. I myself am trying to set up the remoteIO as input and output in order for me to insert my DSP code in between for audio processing. I have everything set up with the audio unit but can’t seem to understand how I am supposed to access the input and output. Any help would be much appreciated.

    • To get the input you can call AudioUnitRender() on the input context of the remoteIO from within your output callback.
      You will also have to set the kAudioOutputUnitProperty_EnableIO property when setting up the graph. Apple’s sample code has an example of this type of pass-thru setup.
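      Roughly, the callback ends up looking like this (a sketch; the refCon and bus numbers are assumptions about how your particular graph is wired up):

          static OSStatus PassThruCallback(void *inRefCon,
                                           AudioUnitRenderActionFlags *ioActionFlags,
                                           const AudioTimeStamp *inTimeStamp,
                                           UInt32 inBusNumber,
                                           UInt32 inNumberFrames,
                                           AudioBufferList *ioData)
          {
              AudioUnit rioUnit = (AudioUnit)inRefCon;   // assumes the remoteIO unit was passed as the refCon
              // Pull the microphone samples from the remoteIO's input side (bus 1)
              // directly into the buffers we were handed for output.
              OSStatus err = AudioUnitRender(rioUnit, ioActionFlags, inTimeStamp,
                                             1, inNumberFrames, ioData);
              if (err) return err;
              // ...do your DSP in place on ioData->mBuffers[] here; whatever is left
              // in ioData plays out of the speaker (bus 0).
              return noErr;
          }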

      • Thanks for such a speedy reply. I just received it today. Will I have to create a buffer or can I access an existing one? I have noticed the buffer mBuffers[i] cropping up in Apple’s documentation but I’m not sure if it has another function. Could you possibly explain the need for multiple buffers? Would it not be just as simple to have a single large one rather than several smaller ones?
