Category Archives: iOS

Spectral Harp Out Now For iOS

Image showing the Spectral Harp app.

Spectral Harp is now available on the iOS App Store for just $0.99!

Spectral Harp is a sound toy: it makes sound by letting you strum or tap strings that represent portions of the audible spectrum. You can produce more variation by using the four sliders along the bottom to control spectrum density, pitch, decay, and a bit crush effect. It’s great for creating things that sound like aliens or droids talking. Here’s a sound sample:

Overhead spikes in the Unity3D profiler

I’m currently using Unity3D to build an iOS app and the other day I decided to do some profiling, just to make sure nothing is taking up crazy amounts of time. What I saw was pretty strange:

Now, if you Google the title of this post, you’ll find quite a few people who are seeing spikes in the Unity3D profiler for something called Overhead, and if you follow those links you’ll find that for the most part people aren’t really sure how to make them go away. The Unity3D docs don’t really explain what Overhead represents, but you can see it in the very first image on the page. Some people on the internet seem to think that Overhead includes garbage collection, but I’m pretty careful about not creating tons of new objects every frame, and others point out that GC.Collect is what shows up for that. Also, if you look closely at the legend in the image, you’ll see that GarbageCollector has its own color.

No, it turned out that the culprit was using UnitySendMessage in my native plugin code! My app has a music system, and in order to keep good metronomic time, I run the metronome in native code. This is important because if the metronome is only updated every frame, you can wind up with wonky rhythm when there are framerate spikes. Lots of things are tied to the metronome in this app, so I needed a way to tell script about “ticks”, which simply means calling into script on every sixteenth note. I used a line of code that looked like this:
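In sketch form (how the tick value gets formatted into a string is illustrative; UnitySendMessage itself is the real entry point Unity exposes to native plugins):

```c
#include <stdio.h>

// Declared by the Unity-generated Xcode project; finds the named GameObject
// and calls the named script method with a string argument.
extern void UnitySendMessage(const char* obj, const char* method, const char* msg);

// Called by the native metronome on every sixteenth-note tick.
void NotifyTick(float tickTime)
{
    char msg[32];
    snprintf(msg, sizeof(msg), "%f", tickTime); // tick value as a string (illustrative)
    UnitySendMessage("Metronome", "Broadcast", msg);
}
```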

This function tells the Unity runtime to find the object named “Metronome” in the scene and then call a function named “Broadcast” that takes a string. In my Metronome.cs script I then converted the string to a float and told all of the objects listening to the metronome that a tick just happened. I wrote it this way because the very first tip given for iOS plugins is “Managed-to-unmanaged calls are quite processor intensive on iOS. Try to avoid calling multiple native methods per frame.” So I figured having Metronome.cs call a native function every frame to retrieve ticks would probably be a bad idea, but it turns out it’s better than using UnitySendMessage.
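For reference, the receiving side in Metronome.cs looked roughly like this (a sketch; everything other than the two Broadcast methods and the string-to-float conversion is omitted):

```csharp
using System.Globalization;
using UnityEngine;

public class Metronome : MonoBehaviour
{
    // Called via UnitySendMessage("Metronome", "Broadcast", ...) from native code.
    void Broadcast(string tickString)
    {
        float tickTime = float.Parse(tickString, CultureInfo.InvariantCulture);
        Broadcast(tickTime);
    }

    // Tells everything listening to the metronome that a tick just happened.
    void Broadcast(float tickTime)
    {
        // ... notify listeners ...
    }
}
```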

I changed the way the metronome works so that when the native code gets to each tick, it pushes it onto the back of a list. Then, in the Update function of my C# Metronome, I pass a float array to a native Metronome_GetTicks function, which fills the array with all of the ticks in the tick list and returns how many ticks it put into the array. For each tick in the array, I call the same Broadcast(float) method that I previously called from Broadcast(string).
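On the C# side that looks something like this (a sketch; the array size and the maxTicks parameter are assumptions, and only the Metronome_GetTicks name comes from the actual code):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class Metronome : MonoBehaviour
{
    // Native plugin entry point: fills outTicks with the ticks queued since
    // the last call and returns how many it wrote.
    [DllImport("__Internal")]
    private static extern int Metronome_GetTicks(float[] outTicks, int maxTicks);

    private readonly float[] ticks = new float[16]; // 16 is an arbitrary upper bound

    void Update()
    {
        int count = Metronome_GetTicks(ticks, ticks.Length);
        for (int i = 0; i < count; i++)
        {
            Broadcast(ticks[i]); // same float overload the string version forwarded to
        }
    }

    void Broadcast(float tickTime)
    {
        // ... notify listeners ...
    }
}
```

Now profiling the exact same level that gave me spikes looks like this: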

Auto-rotation in iOS with openFrameworks

I don’t think that I’ve mentioned this before, but WaveShaper is a mixture of openFrameworks, UIKit, and Minim for C++. One of the major difficulties I’ve had in developing the app is getting the UIKit views and controllers to play nice. It’s not hard to render UIKit widgets on top of the EAGLView: you simply add your UIKit widget as a subview of the UIWindow or the EAGLView that openFrameworks creates. But it does not work to directly use openFrameworks coordinates to set the position and size of those widgets.

The reason it doesn’t work is that (I’m pretty sure) OpenGL’s coordinate system is not the same as Apple’s. Part of what openFrameworks does is translate the coordinates of incoming touches from Apple’s coordinate system to OpenGL’s before sending touch events. This lets you always think about the drawing surface in the “natural” way for OpenGL apps: the upper left corner is always (0,0), positive X is to the right, and positive Y is down. You can use this knowledge to simply transform your OpenGL coordinates to Apple coordinates when setting the position and size of UIKit widgets, but if you are using ofxiPhoneSetOrientation to reorient your OpenGL surface, the mapping between OpenGL coordinates and UIKit coordinates changes depending on the orientation. This means you have to reposition all of your UIKit objects every time the device is reoriented.

Even worse, I discovered that in one of the landscape modes, even though the UIKit widgets looked correct, I actually had the device orientation set to the opposite of what it should have been, so when I displayed popovers, they were upside-down. And when I presented modal view controllers, they would appear in the correct orientation, but then not rotate when the device orientation changed.

After much hair-pulling, I finally came up with a solution that I think is pretty solid. Essentially, what I have done is embed the EAGLView in a view hierarchy that is well-behaved with respect to Apple’s system, and I do not call ofxiPhoneSetOrientation to change the orientation of the app when the device rotates. What this means is that from openFrameworks’ perspective the app never changes orientation, which is true in some sense, because the EAGLView is added as a subview of a vanilla UIView that is strictly responsible for handling the rotation.

OK, on to some actual code. Before setting up any UI, I set the openFrameworks orientation to landscape:
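Something along these lines (a sketch; the exact constant name varies between openFrameworks versions, and later releases use ofSetOrientation with OF_ORIENTATION_90_RIGHT instead):

```cpp
// In setup(), before creating any UIKit widgets: fix the GL surface in
// landscape once. It is never changed again after this.
ofxiPhoneSetOrientation(OFXIPHONE_ORIENTATION_LANDSCAPE_RIGHT);
```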

Then I rearrange the view hierarchy that openFrameworks creates so that I can set the position and size of UIKit widgets using openFrameworks coordinates. Part of this is creating a root view controller for the window, so that I can take advantage of the auto-rotation behavior view controllers provide (and because Apple expects your app to have a root view controller):
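Roughly like this (a sketch using the ofxiPhone accessors; the container view, variable names, and re-parenting details are illustrative, and retain/release is omitted for brevity):

```objc
// Grab the window and GL view that openFrameworks created.
UIWindow *window = ofxiPhoneGetUIWindow();
EAGLView *glView = ofxiPhoneGetGLView();

// A landscape-only root view controller owns a plain container view that
// handles rotation; the GL view is re-parented underneath it.
LandscapeViewController *rootVC = [[LandscapeViewController alloc] init];
rootVC.view = [[UIView alloc] initWithFrame:window.bounds];

[glView removeFromSuperview];
[rootVC.view addSubview:glView];

// UIKit widgets positioned with openFrameworks coordinates get added to
// rootVC.view as well, so they rotate together with the GL view.
window.rootViewController = rootVC;
```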

Here’s the LandscapeViewController class, which limits the supported orientations to landscape and also turns off animations during rotation so that the orientation change snaps. If you like the animated orientation change, you can simply remove those lines.
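A sketch of that class, using the pre-iOS 6 rotation callbacks:

```objc
#import <UIKit/UIKit.h>

// Root view controller that only allows the two landscape orientations and
// disables animation while rotating so the change snaps into place.
@interface LandscapeViewController : UIViewController
@end

@implementation LandscapeViewController

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation
{
    return UIInterfaceOrientationIsLandscape(orientation);
}

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toOrientation
                                duration:(NSTimeInterval)duration
{
    [UIView setAnimationsEnabled:NO]; // remove to keep the animated rotation
}

- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromOrientation
{
    [UIView setAnimationsEnabled:YES];
}

@end
```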

Finally, the class that registers to receive ofxiPhoneAlerts needs to handle device orientation changes like this:
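Something like this (a sketch; the class name and what you do inside the landscape branch depend on your app, and this lives in a .mm file so the UIKit orientation constants are available):

```cpp
// In the ofxiPhoneApp subclass that registered for ofxiPhoneAlerts.
void testApp::deviceOrientationChanged(int newOrientation)
{
    // WaveShaper only displays in landscape, so everything else is ignored.
    if (newOrientation != UIDeviceOrientationLandscapeLeft &&
        newOrientation != UIDeviceOrientationLandscapeRight)
    {
        return;
    }

    // React to the flip here, e.g. re-present any visible popovers so they
    // pick up the new interface orientation.
}
```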

And again, WaveShaper only displays in landscape, which is why those are the only orientations I handle.

Once all of this is in place, you can happily position buttons, sliders, popovers, and whatever else using the openFrameworks coordinate system and they will all look correct and rotate with the screen.

WaveShaper for iPad Released!

Today I released my first sound-making app for iPad!

WaveShaper is an app for iPad that allows you to load up any audio file and make crazy cool sounds with it. It’s a lot like record scratching, but totally maxed out. This video will do a much better job of explaining it than words ever will:

Visit waveshaperapp.com for more info, sound samples, and screenshots. Or just go buy it right now!