Making a sound-activated iPhone camera app
Apple’s camera interface sucks. It sucks for the user, who has to hit a button on the screen (thereby shaking the camera and virtually guaranteeing a blurry result), and it sucks for the developer, who is forced to use a modal UIImagePickerController view to access the phone’s camera.
For my iPhone development class, I wanted to create a minimal sound-activated camera interface. Since my knowledge of Objective-C is pretty rudimentary, I broke the task up into a series of subtasks:
- Accessing the camera without using UIImagePickerController
- Metering and displaying sound levels
- Creating an intuitive way to set the sound threshold
- Adding a countdown timer option
- Creating an automatic email feature
Scouring the web for information on circumventing Apple’s camera interface led me to the approach that many jailbroken apps use: accessing the camera’s hidden classes through a private framework and then saving the pixels of the preview as a UIImage. After an aborted attempt to build a toolchain, I gave up, and eventually found Norio Nomura’s extremely helpful GitHub repository, which includes a class called CameraTest. It provides three different ways of capturing photos by invoking classes from the private framework dynamically at runtime, which means no toolchain and no weird compilation requirements.
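The runtime-lookup trick can be sketched in a few lines. To be clear, the framework path and class name below are illustrative assumptions on my part, not necessarily what CameraTest actually loads:

```objc
#import <Foundation/Foundation.h>
#import <dlfcn.h>

// Load a private framework at runtime and look its class up by name, so that
// nothing private is referenced at compile time -- no custom toolchain needed.
// The path and class name here are illustrative, not taken from CameraTest.
Class cameraControllerClass(void) {
    dlopen("/System/Library/PrivateFrameworks/PhotoLibrary.framework/PhotoLibrary",
           RTLD_LAZY);
    return NSClassFromString(@"PLCameraController");
}
```

Because the class is resolved by string at runtime, the linker never sees the private symbol, which is exactly why no special build setup is required.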
I played with SpeakHere, Apple’s example app for recording and metering audio, and almost cried. It is the most horrendously complicated thing I’ve ever seen: OpenGL, eight classes, tons of C. I suspect that Stephen Celis, the creator of the extremely simple and helpfully documented SCListener class, must have been inspired in part by a similar sense of despair. SCListener outputs the peak and average levels on the microphone and requires nothing but the AudioToolbox framework. I linked the peak to an animated UISlider and, presto, a sound meter.
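A sketch of how the peak level might drive the slider, using SCListener’s sharedListener, listen, and peakPower methods; the polling timer and the ivar names (meterTimer, meterSlider) are my own wiring, not from the post:

```objc
// Poll SCListener's peak level into a UISlider acting as a sound meter.
// meterTimer and meterSlider are assumed ivars on the view controller.
- (void)startMetering {
    [[SCListener sharedListener] listen];
    meterTimer = [NSTimer scheduledTimerWithTimeInterval:0.05
                                                  target:self
                                                selector:@selector(updateMeter:)
                                                userInfo:nil
                                                 repeats:YES];
}

- (void)updateMeter:(NSTimer *)timer {
    // peakPower is roughly 0.0 to 1.0, which maps directly onto the
    // slider's default range. Comparing this value against the user's
    // threshold is what would fire the shutter.
    meterSlider.value = [[SCListener sharedListener] peakPower];
}
```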
One of the benefits of using a UISlider as the sound meter is that it is also, well, a slider. When the user touches it, it stops monitoring the sound and follows their finger; wherever they release it, that’s the new threshold. I still need to add some persistent feedback, possibly a small colored bar or other subtle indication of the threshold’s current value, but even without it, it works pretty well.
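The dual-use slider might be wired up something like this; the metering flag and soundThreshold ivar are hypothetical names I’ve introduced for illustration:

```objc
// The same UISlider serves as both meter and threshold control.
// BOOL metering and float soundThreshold are assumed ivars.
- (void)viewDidLoad {
    [super viewDidLoad];
    [meterSlider addTarget:self
                    action:@selector(sliderTouched:)
          forControlEvents:UIControlEventTouchDown];
    [meterSlider addTarget:self
                    action:@selector(sliderReleased:)
          forControlEvents:UIControlEventTouchUpInside | UIControlEventTouchUpOutside];
}

- (void)sliderTouched:(UISlider *)slider {
    metering = NO;  // stop driving the slider from the microphone
}

- (void)sliderReleased:(UISlider *)slider {
    soundThreshold = slider.value;  // wherever the finger lifts is the trigger level
    metering = YES;                 // resume metering
}
```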
For hands-free operation or self-portraits, a countdown was a trivial matter of setting an NSTimer. The most difficult part was figuring out how to create a tab bar button that changes its image with every user touch.
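A minimal countdown sketch, assuming a takePicture method exists elsewhere in the controller; the tick interval and ivar names are my own:

```objc
// Count down once per second, then trigger the capture.
// secondsLeft is an assumed ivar; takePicture is a hypothetical capture method.
- (void)startCountdown {
    secondsLeft = 5;
    [NSTimer scheduledTimerWithTimeInterval:1.0
                                     target:self
                                   selector:@selector(tick:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)tick:(NSTimer *)timer {
    // A label update here would give the user visible feedback each second.
    if (--secondsLeft <= 0) {
        [timer invalidate];
        [self takePicture];
    }
}
```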
This seemed like it would require a complicated notification system and PHP scripts on a web server, but Apple has opened up apps’ access to email in the forthcoming 3.0 SDK, so this feature is on hold until then.