Category Archives: Documentation

Auto-rotation in iOS with openFrameworks

I don’t think that I’ve mentioned this before, but WaveShaper is a mixture of openFrameworks, UIKit, and Minim for C++. One of the major difficulties I’ve had in developing the app is getting the UIKit views and controllers to play nice. It’s not hard to render UIKit widgets on top of the EAGLView: you simply add your UIKit widget as a subview of the UIWindow or the EAGLView that openFrameworks creates. What does not work is directly using openFrameworks coordinates to set the position and size of UIKit widgets.

The reason it doesn’t work, I’m pretty sure, is that openGL’s coordinate system is not the same as Apple’s. Part of what openFrameworks does is translate the coordinates of incoming touches from Apple’s coordinate system to openGL’s before sending touch events. This lets you always think about the drawing surface in the “natural” way for openGL apps: the upper left corner is always (0,0), positive X is to the right, and positive Y is down. Knowing this, you can simply transform your openGL coordinates to Apple coordinates when setting the position and size of UIKit widgets. But if you are using ofxiPhoneSetOrientation to reorient your openGL surface, the mapping between openGL coordinates and UIKit coordinates changes depending on the orientation, which means you have to reposition all of your UIKit objects every time the device is reoriented. Even worse, I discovered that in one of the landscape modes, even though the UIKit widgets looked correct, I actually had the device orientation set to the opposite of what it should have been, so when I displayed popovers they were upside-down. When I presented modal view controllers, they would appear in the correct orientation but then not rotate when the device orientation changed.

After much hair-pulling, I finally came up with a solution that I think is pretty solid. Essentially, what I have done is embed the EAGLView in a view hierarchy that is well behaved with respect to Apple’s system and I do not call ofxiPhoneSetOrientation to change the orientation of the app. What this means is that from openFrameworks’ perspective the app never changes orientation, which is true in some sense because the EAGLView is added as a subview to a vanilla UIView that is strictly responsible for handling the rotation.

Ok, onto some actual code. Before setting up any UI, I set the openFrameworks orientation to landscape:
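That call looks something like this (a sketch; the constant name is from the ofxiPhone addon of this era and may differ in your version):

```objc
// In testApp::setup(), before creating any UIKit widgets: put openFrameworks
// into landscape once, and never touch its orientation again after this.
ofxiPhoneSetOrientation( OFXIPHONE_ORIENTATION_LANDSCAPE_RIGHT );
```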

Then I rearrange the view hierarchy that openFrameworks creates so that I can set the position and size of UIKit widgets using openFrameworks coordinates. Included in this is creating a root view controller for the window, so that I can take advantage of the auto-rotation behavior they provide (and because Apple expects your app to have a root view controller):
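A sketch of that rearrangement (Objective-C++; the ofxiPhoneGetUIWindow and ofxiPhoneGetGLView accessors come from ofxiPhoneExtras and may differ by version, and retain/release housekeeping is omitted for brevity):

```objc
UIWindow * window = ofxiPhoneGetUIWindow();
UIView * glView   = ofxiPhoneGetGLView();

// Pull the GL view out of wherever openFrameworks put it.
[glView removeFromSuperview];

// LandscapeViewController is a UIViewController subclass that only allows
// landscape orientations. Its view is a vanilla UIView that is strictly
// responsible for handling rotation; the EAGLView lives inside it.
LandscapeViewController * rootController = [[LandscapeViewController alloc] init];
rootController.view = [[UIView alloc] initWithFrame:[window bounds]];
[rootController.view addSubview:glView];

// Installing it as the window's root view controller gets us UIKit's
// auto-rotation behavior for free.
window.rootViewController = rootController;
```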

Here’s the LandscapeViewController class, which limits supported orientations to landscape orientations and also turns off animations during rotation so that the orientation snaps. If you like the animated orientation change, you can simply remove those lines.
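A sketch of that class, using the pre-iOS 6 rotation callbacks:

```objc
@interface LandscapeViewController : UIViewController
@end

@implementation LandscapeViewController

// Only the two landscape orientations are supported.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation
{
    return UIInterfaceOrientationIsLandscape( orientation );
}

// Disable animation around the rotation so the orientation change snaps.
// Remove these two methods if you prefer the animated rotation.
- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toOrientation
                                duration:(NSTimeInterval)duration
{
    [UIView setAnimationsEnabled:NO];
}

- (void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromOrientation
{
    [UIView setAnimationsEnabled:YES];
}

@end
```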

Finally, the class that registers to receive ofxiPhoneAlerts needs to handle device orientation changes like this:
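The general shape of that handler might be something like this (a sketch, and the body is an assumption: since openFrameworks itself never rotates, all the listener needs to do is keep UIKit’s notion of interface orientation in sync — note that a device held in landscape-left reports the landscape-right interface orientation, and vice versa):

```objc
void testApp::deviceOrientationChanged( int newOrientation )
{
    // Ignore portrait and face-up/face-down; the app is landscape-only.
    if( newOrientation == UIDeviceOrientationLandscapeLeft )
    {
        [UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeRight;
    }
    else if( newOrientation == UIDeviceOrientationLandscapeRight )
    {
        [UIApplication sharedApplication].statusBarOrientation = UIInterfaceOrientationLandscapeLeft;
    }
}
```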

And again, WaveShaper only displays in landscape, which is why those are the only orientations I handle.

Once all of this is in place, you can happily position buttons, sliders, popovers, and whatever else using the openFrameworks coordinate system and they will all look correct and rotate with the screen.

Frequency Modulation Is Easy

I’m working on adding the ability to do frequency modulation in Minim. I didn’t figure it was a very complicated thing. I knew that I’d have to do it while generating each sample, so it couldn’t be done as an AudioEffect. But when I went looking online for how to do it, I found a number of articles that described the technique in words, with somewhat dense equations, but none that had easy-to-understand code examples. I did find this article, but the code in it is some form of Lisp, a language I am not familiar with. I thought I had it figured out a couple of times, but every time I tried something new I still wasn’t achieving the correct result.

So today I asked my friend The Mysterious H how it works because he’s a smart guy and builds synths out of Atari sound chips and shit like that. He goes, “Oh it’s easy.”

Every time you want to generate a sample of the signal you are modulating (the carrier), generate a sample of your modulator and add that to the phase you use to generate your carrier sample.

Said another way: Say you’ve got a signal called f. At every time step t (which is your phase), you generate a sample s by evaluating f(t). If you want to modulate the frequency of f with a modulator called m, then at every t you will do this:
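In symbols, that’s s = f(t + m(t)). Here is a minimal sketch in Java (Minim’s home language), using sine waves for both carrier and modulator; the class and method names are illustrative, not part of Minim’s API:

```java
// Per-sample frequency modulation: sample the modulator and add its output
// to the phase used to sample the carrier, i.e. s = f(t + m(t)).
public class FMSketch {

    public static float[] generate(float carrierHz, float modHz, float modAmp,
                                   float sampleRate, int numSamples) {
        float[] out = new float[numSamples];
        for (int i = 0; i < numSamples; i++) {
            float t = i / sampleRate;
            // generate a sample of the modulator...
            float m = modAmp * (float) Math.sin(2.0 * Math.PI * modHz * t);
            // ...and add it to the phase of the carrier
            out[i] = (float) Math.sin(2.0 * Math.PI * carrierHz * (t + m));
        }
        return out;
    }
}
```

With modAmp around 0.001 this gives the vibrato-like wobble described below; crank it up and push modHz into audio range for the alien sounds.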

The speed of the modulation is determined by the frequency of your modulator and the amount of the modulation (how many hertz above and below the frequency of your carrier signal the audible signal will swing) is determined by the amplitude of the modulator. In my tests I found that the amplitude had to be around 0.001 to achieve what one would consider vibrato. Setting it to 0.1 made for some pretty intense alien sounds when the frequency of the modulator was up in audio signal range (20 Hz and up, say).

It’s lots of fun to play with and will be coming soon!

Minim Manual Updated

Reas kicked me in the ass (via e-mail), so I updated the Minim Manual to reflect the Minim 2.0 API. I need to give a shout-out to Urban Giraffe for their WordPress plug-in Sniplets. This enabled me to pull all of the examples in the Manual directly from the online examples. This means that any time I update the online examples, they will be immediately reflected in the Manual. Rock, rock on, says I.

I’m pretty sure that I didn’t leave anything out, but if you spot any omissions, mistakes, or poorly written explanations, please let me know! I will endeavor to fix the problem as quickly as possible.

Also worth mentioning in this post is that Processing 1.0 is out! I can’t say I contributed much to it, but its imminent release is what got me to finally clean up and release Minim 2.0. One exciting detail I don’t think I’ve mentioned is that Minim is now one of the libraries that comes with the Processing download. I am very honored to have my library so closely tied to such an awesome project, and I hope that people find working with Minim as easy as it is to work with the core Processing API.

Minim Manual Finished

Well, I accomplished one of my two goals and finished writing the Minim manual. Hopefully it is clear and will be a useful reference for people. I wasn’t able to push out a new release because of holiday parties, distractions from other projects, and so forth.

Some good news for Minim development, however, is that I finally got around to setting up my PC to dual boot Ubuntu and Windows XP, so I’ll be developing primarily in Linux and then testing things back in Windows. I have already discovered that Linux is pretty tetchy, has latency issues, and is just generally not as great for audio. I have an idea for how to solve the latency issue, but it will require pretty significant recoding. I may just release a new version of Minim within the next month to make some small API changes available and then push dealing with the latency issue into the following release. I’m going to try to be better about semi-regular releases so that when people send me bugs I deal with them quickly and then release a new version.