Previous: Setting up the AUGraph.
In the previous two posts, we set up the project and hooked up the audio plumbing. Now we finally get to the actual work. Amazingly, there’s just one function left to write. The audio render callback is the function you provided earlier as the input to the mixer AudioUnit. Whenever the mixer needs new audio input, it calls the render callback, and it is up to you to fill a buffer with audio samples. It is a C function with this specific set of parameters.
- inRefCon – A pointer to an object that is used to pass in parameters.
- AudioUnitRenderActionFlags – Indicates special states; we won’t need it here.
- AudioTimeStamp – Used if you need to synchronize multiple sources.
- inBusNumber – The specific bus of the Audio Unit that called the function.
- inNumberFrames – The number of frames of sample data the callback must fill.
- ioData – An AudioBufferList, which is a struct containing an array of buffers representing sample data and a count of those buffers.
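In C, that parameter list is the AURenderCallback signature. The sketch below re-declares the AudioToolbox types as simplified stand-ins so the fragment is self-contained (a real project gets them from `<AudioToolbox/AudioToolbox.h>`, where AudioBufferList’s buffer array is variable-length); the trivial body just writes silence into every buffer:

```c
#include <stddef.h>
#include <string.h>

/* Simplified stand-ins for the AudioToolbox declarations, for
 * illustration only. Do not re-declare these in a real project. */
typedef int OSStatus;
typedef unsigned int UInt32;
typedef UInt32 AudioUnitRenderActionFlags;
typedef struct { double mSampleTime; } AudioTimeStamp;
typedef struct {
    UInt32 mNumberChannels;
    UInt32 mDataByteSize;
    void  *mData;
} AudioBuffer;
typedef struct {
    UInt32      mNumberBuffers;
    AudioBuffer mBuffers[1]; /* variable-length in the real header */
} AudioBufferList;

/* The render callback has exactly this shape. This placeholder body
 * just zeroes out each buffer it is handed. */
OSStatus renderCallback(void                       *inRefCon,
                        AudioUnitRenderActionFlags *ioActionFlags,
                        const AudioTimeStamp       *inTimeStamp,
                        UInt32                      inBusNumber,
                        UInt32                      inNumberFrames,
                        AudioBufferList            *ioData)
{
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0,
               ioData->mBuffers[i].mDataByteSize);
    return 0; /* noErr */
}
```

Returning 0 (noErr) tells Core Audio the buffer was filled successfully.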
Here’s what we’re going to be doing.
- Getting a “THIS” pointer so we can access AudioController variables.
- Getting a pointer to the buffer we want to write to. (Since there is only one buffer here, it will be at index 0.)
- Performing some preliminary setup for the sine wave.
- Looping inNumberFrames times, calculating a sine wave and writing sample values to the buffer.
- Saving any AudioController variables that need to be remembered across calls to the render function.
Previous: iPhone Core Audio – Getting Started
In the previous section we went over the tedious details of getting a project set up. Now we are almost ready to start exploring some code. But first, some Core Audio concepts.
- Audio Unit – A collection of audio processing code that is bundled up for reuse in many applications. This is the same concept as a VST or AudioSuite plug-in. On OS X there is a good-sized set of built-in Audio Units that do common tasks like filtering or reverb, and you can install third-party AU plug-ins. On iPhone OS the built-in set is more limited. There is not (yet) any way to load third-party AudioUnits.
- AUGraph – A set of functions for loading, managing, and connecting AudioUnits into a signal processing system.
- remoteIO – An audio unit that connects to the iPhone’s audio input and output hardware.
- AudioComponentDescription – A struct used to identify an AudioUnit.
- AudioStreamBasicDescription – A struct that holds information about constant bitrate audio that will be transferred; either to a file, through audio units, or out to audio hardware.
- CAStreamBasicDescription – A wrapper around the AudioStreamBasicDescription struct that provides some convenience methods for modifying values and printing.
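As an illustration of how an AudioComponentDescription identifies a unit, here is a sketch that fills one out for remoteIO. The struct and FOURCC macro are re-declared as simplified stand-ins so the fragment stands alone; in a real project both the struct and the named constants come from the AudioToolbox headers, and you would hand the description to AUGraphAddNode:

```c
/* Simplified stand-ins; a real project gets these from
 * <AudioToolbox/AudioToolbox.h>. */
typedef unsigned int OSType;

typedef struct {
    OSType       componentType;
    OSType       componentSubType;
    OSType       componentManufacturer;
    unsigned int componentFlags;
    unsigned int componentFlagsMask;
} AudioComponentDescription;

/* Build a four-character code like 'auou'. */
#define FOURCC(a, b, c, d) \
    ((OSType)(((a) << 24) | ((b) << 16) | ((c) << 8) | (d)))

/* Describe the remoteIO unit. The comments give the AudioToolbox
 * constants you would actually write. */
AudioComponentDescription makeRemoteIODescription(void) {
    AudioComponentDescription desc = {
        FOURCC('a', 'u', 'o', 'u'), /* kAudioUnitType_Output */
        FOURCC('r', 'i', 'o', 'c'), /* kAudioUnitSubType_RemoteIO */
        FOURCC('a', 'p', 'p', 'l'), /* kAudioUnitManufacturer_Apple */
        0, 0
    };
    return desc;
}
```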
Inside of the AudioController class we will set up an AUGraph containing a remoteIO audio unit and a multi-channel mixer audio unit. In your AudioController.h file you will need to:
- Import the AudioToolbox framework.
- Import CAStreamBasicDescription.h
- Create instance variables to represent an AUGraph and a Mixer audio unit.
- Add methods to initialize, start and stop the AUGraph.
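A sketch of what that header might end up looking like (the class and method names here are my choices, not prescribed by the frameworks):

```objc
// AudioController.h (sketch)
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import "CAStreamBasicDescription.h"

@interface AudioController : NSObject {
    AUGraph   mGraph;  // the processing graph
    AudioUnit mMixer;  // the multichannel mixer unit inside the graph
}

- (void)initializeAUGraph;
- (void)startAUGraph;
- (void)stopAUGraph;

@end
```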
After I’ve been working on a project for a while, I tend to tuck away and forget about the low level plumbing details I needed to get the project off the ground. Consequently, when I want to quickly try out a new idea in a new project, I end up wasting time remembering all of the i’s I needed to dot and t’s I needed to cross. So, as I am at the beginning of another project, I thought I would make a checklist for myself and share it with you along with a little explanation of why each step is necessary. Hopefully it is useful for some of you out there as a “Hello World” tutorial of sorts.
Core Audio is complicated; there’s no getting around it. It is capable of some amazing things, but getting a simple project started requires synthesizing a large and disparate set of concepts, documentation, mailing lists, and code samples. While Apple’s documentation is light years better now than it was back in the Panther/Tiger era, it still doesn’t do a great job of getting someone started. These are the steps I use to build an iPhone project that uses the remoteIO AudioUnit. RemoteIO is the system to use if you need low-latency audio input/output for tasks such as a synthesizer app. If you just need to play a few sound effects, you should check out System Sound Services or OpenAL. If you want to stream large media files, you should check out Audio Queue Services.
There are three basic tasks that need to be addressed to get live audio going.
- Setting up the Xcode project so that it has all of the necessary files and libraries. (Appeasing the #include gods.)
- Initializing and connecting various audio objects. (Setting up the plumbing.)
- Creating an audio callback function. (Where your work gets done.)
This project will perform the simplest audio task possible: playing a steady 600 Hz sine wave.
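For concreteness: at the iPhone’s 44.1 kHz hardware sample rate, one cycle of a 600 Hz tone spans 44100 / 600 = 73.5 samples, so each sample advances the wave’s phase by 2π · 600 / 44100 ≈ 0.0855 radians. A pair of tiny helpers (names mine) captures that arithmetic:

```c
static const double kTwoPi = 6.283185307179586;

/* Samples in one full cycle of the tone: 44100 / 600 = 73.5. */
double samplesPerCycle(double frequencyHz, double sampleRate) {
    return sampleRate / frequencyHz;
}

/* Phase advance per sample, in radians. */
double phaseIncrementFor(double frequencyHz, double sampleRate) {
    return kTwoPi * frequencyHz / sampleRate;
}
```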
VocaForm 1.4 demonstration of audio effects, key selection, landscape keyboard, and envelope control.
Combining cooking and iPhone apps is right up my alley, so I decided to make this cake for this year’s Great App Bake Off, transforming the VocaForm mascot from an icon into edible “artwork.”
I started by baking a coconut cake from this recipe and whipping up some buttercream frosting from this recipe, substituting coconut milk for the regular milk. I made the frosting green using Matcha, a green tea powder.
[Edit] – The contest has ended but check out the Best App Ever Award Promo Code Giveaway, featuring free downloads from VocaForm and dozens of other contest nominees. [/Edit]
We’re coming down to the end of the nomination period in the 148Apps Best App Ever Contest. Help me out by giving me a nomination. There are a lot of categories over there, but I think VocaForm fits best in the Best Musical Synthesizer area.
VocaForm stands out in the crowd of instrument apps with distinctive, real-time synthesized sounds, great digital audio effects, and an interface that is designed for musical expression. It makes a great addition to the arsenal of any iPhone carrying musician. For more info, check out the VocaForm page.
Dan Grigsby at Mobile Orchard is helping to promote the 148Apps Best App Ever Contest. Dan’s Mobile Orchard podcast taught me everything I know about promoting an iPhone application. I highly recommend the podcast and the site for anyone who needs to figure out how to balance coding and marketing.
Nine Inch Nails’ “Hurt” played on my iPhone instrument VocaForm and acoustic guitar. There is one guitar track; everything else is played on an iPod Touch.
Audio recorded with a PreSonus Firepod and Garageband. Video recorded with a Canon T1i and a Panasonic Lumix DMC-TZ5. Edited in Final Cut Pro.
More info: http://timbolstad.wordpress.com/vocaform
iTunes Link: http://itunes.com/apps/VocaForm