All posts by ddf

Spectral Harp Out Now For iOS

Image showing the Spectral Harp app.

Spectral Harp is now available on the iOS App Store for just $0.99!

Spectral Harp is a sound toy that generates sound by letting you strum or tap strings that represent portions of the audible spectrum. You can produce more variations in sound by using the four sliders along the bottom to control spectrum density, pitch, decay, and a bit crush effect. It’s great for creating things that sound like aliens or droids talking. Here’s a sound sample:

Digitally Signing A Unity Mac Build

These days, if you want people to be able to run your game on a Mac, you’ve got to digitally sign the thing or else most users will see the dreaded Gatekeeper dialog that claims the game is damaged and should be moved to the trash. All it means to digitally sign your game is that you’re using Apple tools to embed your Developer ID in it so that Gatekeeper will trust the thing. My buddy Rusty has a really straightforward post about how to do this and how to test it to make sure it works. Read his post first.

Ok, you’re back from reading it? Maybe you noticed he’s not talking about signing a Unity game, so I’m gonna fill in those details.

In place of Rusty’s Step 3, you’ll be building for Mac from Unity, but first you’ll want to run the Unity Entitlements Tool on your project. For distribution outside of the Mac App Store you only need to fill out the Code Sign section; you don’t need Entitlements or Sandboxing. One annoying requirement of the tool is that you need to provide an icns file. On Cosmic DJ I fished UnityPlayer.icns out of the version of the app that I’d built before setting up the tool, though there are ways to create your own icns file. Here’s what the Unity Entitlements Tool looks like for Cosmic DJ:

For distribution outside of the Mac App Store you need to make sure you use Developer ID Application as the signing entity. When you hit the Update Build Pipeline button it will generate an entitlements file in your project and either create a PostprocessBuildPlayer script or append some code to the existing one. What this script does is explicitly sign all of the Unity framework DLLs and all of your Plugins.

When the build completes you still need to sign the .app itself, as in Rusty’s post. It will ask if you want to overwrite the existing signature and you should say yes. You can verify that it worked properly like this:
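(Something like the following, with YourGame.app standing in for whatever your build is actually called.)

    spctl -a -v YourGame.app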

And it should say:
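    YourGame.app: accepted
    source=Developer ID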

And then follow the “quarantine” steps: upload to a server or Google Drive or something, download to your computer, attempt to run.

Building a Windows Plugin for Unity: Lessons Learned

Here’s some fun trivia about Cosmic DJ: the way that we achieve our rock-solid musical timing and responsive tap input is by using an external audio engine that I wrote. Generally speaking I would not recommend doing this! However, it’s just not really possible to have perfect metronomic time if you are running your metronome in the main game loop. Fluctuations in frame rate mean fluctuations in the smallest timestep available to the metronome, so if a frame takes longer than a 16th note you’ll hear the rhythm get a little wonky! Anyway, that’s a separate discussion, I’m going to talk about all of the things I got caught on while trying to build this non-trivial plugin.

Develop in Visual Studio 2008

A major thing I got stuck on for a while was trying to build the plugin with Visual Studio 2010. It seemed reasonable enough to do so, but it turns out you must build the plugin against .NET 2.0, which was not something I was even able to select in VS2010. I attempted to install older versions of .NET to build against by searching around the Microsoft download site, but only wound up with installers that didn’t work properly in Windows 7. Visual Studio 2008, meanwhile, defaulted to .NET 2.0, so I used that. Other build settings that seemed to matter a lot were:

  • General > Common Language Runtime support => No Common Language Runtime Support
  • General > Character Set => Use Multi-Byte Character Set
  • C/C++ > Code Generation > Enable C++ Exceptions => Yes (/EHsc)
  • C/C++ > Code Generation > Runtime Library => Multi-threaded (/MT)

Unity has a really simple example plugin in their documentation that I’d recommend simply copying all of the settings out of. It’s called Simplest Plugin and is linked under the Examples section of Build Plugins For Desktop Platforms in the Unity Manual. I suggest looking for this page in your local documentation, just to be safe.

Create A Single Project Solution

There are a couple other libraries included in the plugin that I wanted to statically link to, such as my own partial port of Minim to C++ (YMMV), and like you do I wanted to have a solution that included the vcproj for Minim as a sub-project. This turned out to be problematic, mostly because of the very particular build settings required by Unity. I found it difficult to make sure all of my projects agreed in their settings and also linked together properly.

Put DLL Dependencies In The Correct Place

Windows doesn’t have audio file reading routines built-in (like CoreAudio does in OSX for instance), so we leverage a couple open source libraries to make this happen (libsndfile and mpg123 if you must know). Both of these libraries are distributed such that you link against a .lib, but the real code lives in a DLL that is loaded at runtime. Unity can handle this but not quite in the way you’d expect.

To work in the Editor: Any DLLs that your plugin depends on must be put in the top level of your Unity Project folder, not in Assets/Plugins as you might expect.

To work in the Standalone Player: Any DLLs that your plugin depends on must be put in the same folder as the exe file, which either means copying them by hand or writing a post-process script to copy them.
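Here’s a minimal sketch of that second approach using Unity’s PostProcessBuild callback (this is not the exact Cosmic DJ script; the DLL names are just examples, and the script assumes the DLLs sit at the top level of the project folder, which is the working directory when building from the Editor). It needs to live in an Editor folder:

    using System.IO;
    using UnityEditor;
    using UnityEditor.Callbacks;

    public static class CopyPluginDependencies
    {
        // example names: substitute the DLLs your plugin actually loads at runtime
        static readonly string[] kDependencies = { "libsndfile-1.dll", "libmpg123-0.dll" };

        [PostProcessBuild]
        public static void OnPostprocessBuild( BuildTarget target, string pathToBuiltProject )
        {
            if ( target != BuildTarget.StandaloneWindows )
                return;

            // copy each dependency from the project folder
            // to the folder containing the built exe
            string buildFolder = Path.GetDirectoryName( pathToBuiltProject );
            foreach( string dll in kDependencies )
            {
                File.Copy( dll, Path.Combine( buildFolder, dll ), true );
            }
        }
    }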

You Can Debug Your DLL Using Visual Studio

A cool thing: it’s possible to debug your plugin code in Visual Studio while the DLL runs in the Unity Editor. For your debug flavor you’ll want Runtime Library set to Multi-threaded Debug (/MTd). With your project open in Visual Studio: start the Unity Editor, go to the Tools menu in Visual Studio and choose “Attach To Process…”, choose Unity.exe from the list (Visual Studio might hang for a bit before becoming responsive again), set some breakpoints, and press Play on a scene in Unity that has a script that calls into your plugin. When you call into code that has a breakpoint, VS will catch it and you can inspect variables and do all the things you’d normally do. Important: Unity won’t load your plugin until the first time a script calls out to it, so when you initially attach to the Unity process it’ll look like your breakpoints are inaccessible, which is true, because the DLL hasn’t been loaded into the process yet.

Use String Marshaling When Returning Strings

Behavior seems inconsistent across platforms with regard to how Mono deals with strings returned from plugin functions. On OSX it seems to work fine to return static strings, but on iOS and Windows this causes a problem when Mono tries to deallocate that memory. If you search around the web you’ll see that some folks recommend allocating a new string and returning that, so that everything is cool when Mono tries to deallocate it, but I was still getting crashes on Windows with that method. Instead, what you want to do is return a const char * from your plugin code and then have this in the C# binding:
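(A sketch; the plugin and function names here are placeholders, not an actual binding.)

    using System;
    using System.Runtime.InteropServices;

    public class AudioPluginBindings
    {
        // the native function returns a const char *,
        // which arrives on the C# side as an IntPtr
        [DllImport("AudioPlugin")]
        private static extern IntPtr AudioPlugin_GetVersion();

        public static string GetVersion()
        {
            // copy the native string into a managed string
            return Marshal.PtrToStringAnsi( AudioPlugin_GetVersion() );
        }
    }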

Importantly, PtrToStringAnsi copies the string pointed to by the IntPtr and also widens the characters to Unicode, so now you’ve got a string backed by managed memory that will not cause crashes when it goes away. Of course, this means you definitely do not want to allocate a new string when you return one to Unity, because that memory will be leaked.

Cosmic DJ Released In Special Humble Bundle!

Wow! So Excite! The music game I’ve been working on for the past two years with my buds Eric and Matt over at GL33k is now available for Windows and Mac through the Devolver Digital Double Debut Humble Bundle! This is a pay-what-you-want bundle of games and movies. You can also choose to put some of your payment towards Brandon Boyer’s Cancer Treatment Relief Fund. He’s a super awesome dude who has been instrumental in bringing indie games the attention and respect they deserve and has been straight up stiffed by his insurance company because they are invoking a super dicey pre-existing condition decision.

What’s Cosmic DJ, you may be wondering? It’s a crazy cool music game where you can sequence up to five instruments over backing tracks and then the game will remix all of your sequences into a song you can share with your friends. In other words, it’s a super fun way for non-musicians and musicians alike to make music! Here, maybe watching our trailer will explain it better:


You can also make cover art for your songs right in the game while you wait for the song to be recorded. Here’s a track I made recently:

And there’s more to come: we will be releasing Cosmic DJ for iOS in the near future and hopefully bringing the game to Steam as well. I’ve also got a couple technical blog posts lined up about some of the stickier issues we encountered while preparing the desktop release.

You can keep tabs on Cosmic DJ through Facebook and Twitter. Tell your friends!

Minim 2.2.0 Released

The day has arrived. It has (for me) taken an excruciatingly long time to arrive, but here it is.

I have finally released a new version of Minim.

It is essentially not much different from the 2.1.0 Beta release that many of you are familiar with and which has been included with Processing for about a year. But the documentation is now in a state that I don’t feel totally embarrassed about and that will be easier to maintain moving forward. The Minim Manual is no more (and I apologize for all the links out there on the web that will break as a result), but it just doesn’t make sense to try to maintain two sets of documentation. The point of the Manual was to give new programmers documentation that was more approachable than bare Javadocs, and I think the new documentation site, generated with my hacked version of the popular proDOC, accomplishes that goal. I’ve also kept and updated the Quickstart Guide for those people who really don’t want to read very much before getting their feet wet.

I’ve initiated a pull request with the Processing team on Github, so I think you can expect to see this version of Minim included with the next release of Processing, though I can’t say exactly when that will be.

Moving forward I hope to be able to have more regular releases with bug fixes and so forth. If you experience bugs or find a hole in the documentation that you really wish could be filled, please open an Issue on Github and let me know about it!

STARPHONIX:
A #GAMEMAKINGFRENZY Game

I just spent basically my entire weekend working on a game for #GAMEMAKINGFRENZY, the Fantastic Arcade 48-hour game jam. The theme was Intergalactic Fantastic with the additional restriction of No Humans Allowed. I collaborated with George Royer from White Whale Games to create a game about the STARPHONIX series of probes, which were sent out to negotiate peace treaties with alien probes to prevent the destruction of planets of interest.

You can play it in your browser, download it for Windows, or download it for Mac.

Portraying The Terran Condition: An Approach To Simulate A Civilization

I am currently the artist-in-residence for monochrom at the Museumsquartier in Vienna, Austria. Gotta say, hanging out in an old European city thinking about art is pretty sweet. As part of the residency, I decided to participate in Fuck This Jam, which is a week-long game jam with the theme of “making a game in a genre you hate.” So, in collaboration with Johannes Grenzfurthner from monochrom, I created a First Person Research Shooter called Portraying The Terran Condition: An Approach To Simulate A Civilization.

Portraying The Terran Condition is a 7D (backwards compatible to 2D) world simulation which depicts six different key events in the history of Terra (“Earth”), a low-tech civilization that self-destructed several aeons ago. Based on the relatively few biological and cultural artifacts, a team of multi-AI minds was able to recreate a stunningly accurate depiction of this ancient civilization.

The game is available for Windows and Mac.

Additional thanks are due to Eric Wenske, my teammate from Cosmic DJ, who made a couple models, and to Heather Kelley, fellow Kokoromi member, whose wise words led me to do procedural level generation for the game. So, I guess it’s also a rogue-like!

Controlling a Meeblip from Reaper with a JS effect

I recently purchased a Meeblip because I wanted a project to work on with my maker friends that I get together with every week. I decided to buy the full build-it-yourself kit, which meant I was in for a fair amount of soldering and so forth. Anyway, that all finally came together and I plugged it into my computer to see if I could control it with a MIDI track in Reaper. Yup! You sure can!

However, what you can’t do is create an envelope for any old MIDI CC message. You can put CC messages into the MIDI clips themselves, but that’s not very conducive to doing the kind of automation I’m used to doing with bass synth VSTs. It turns out that Reaper comes with a JS effect called MIDI_CCRider, which allows you to put an LFO on any CC message you want. I was able to look up which CC message was filter cutoff for the Meeblip and automate it using that effect. But this was not great because it meant that I’d need an instance of the effect for every parameter I wanted to control and that I’d have to constantly remind myself which CC messages controlled which parameters of the synth. So, instead, I decided to use that effect as the basis for a new JS effect that would have all that information baked into it and simply give me a bunch of sliders and combo boxes labeled the same as on the Meeblip. And voilà:

The effect only sends a CC message when the value of a slider changes. This means that you can use it to set some initial values and then play with the knobs on the Meeblip while you loop a bassline. Only parameters of the effect that you choose to automate with an envelope will be constantly overridden. If you’d like to try it out with your Meeblip, you can grab it from Github: https://gist.github.com/3606021
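The core of that send-on-change logic looks something like this (a sketch in Reaper’s JS language rather than the actual gist, and the CC number here is a made-up example, not necessarily the Meeblip’s real filter cutoff CC):

    desc:Meeblip controller (sketch)

    slider1:64<0,127,1>Filter Cutoff

    @slider
    // @slider runs whenever a slider value changes;
    // only send a CC if this particular value actually changed
    slider1 != lastCutoff ? (
      // $xB0 is a CC message on MIDI channel 1; the CC number (48 here)
      // and the new value are packed into the two data bytes
      midisend(0, $xB0, 48 | (slider1 * 256));
      lastCutoff = slider1;
    );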

Overhead spikes in the Unity3D profiler

I’m currently using Unity3D to build an iOS app and the other day I decided to do some profiling, just to make sure nothing is taking up crazy amounts of time. What I saw was pretty strange:

Now, if you Google the title of this post, you’ll find quite a few people who are seeing spikes in the Unity3D profiler for something called Overhead, and if you follow those links you’ll find that, for the most part, people aren’t at all certain how to make them go away. The Unity3D docs don’t really explain what Overhead represents, but you can see it in the very first image on the page. Some people on the internet seem to think that Overhead includes garbage collection, but I’m pretty good about not creating tons of new objects every frame, and others point out that GC.Collect is what will show up for that. Also, if you look closely at the legend in the image, you’ll see GarbageCollector has its own color.

No, it turned out that the culprit was using UnitySendMessage in my native plugin code! My app has a music system and, in order to keep good metronomic time, I run the metronome with native code. This is important because if the metronome is only updated every frame, then you can wind up with wonky rhythm when there are framerate spikes. Lots of things are tied to the metronome in this app, so I needed a way to tell script about “ticks”, which simply means calling into script on every sixteenth note. I used a line of code that looked like this:
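(Reconstructed as a sketch; tickValue is a placeholder for the metronome’s current tick.)

    #include <stdio.h>

    // declared by the iOS Unity runtime
    extern void UnitySendMessage( const char* obj, const char* method, const char* msg );

    // called by the native metronome on every sixteenth note
    static void NotifyTick( float tickValue )
    {
        // UnitySendMessage can only pass a single string argument,
        // so the tick value gets formatted as a string
        char msg[32];
        snprintf( msg, sizeof(msg), "%f", tickValue );
        UnitySendMessage( "Metronome", "Broadcast", msg );
    }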

This function tells the Unity runtime to find the object named “Metronome” in the scene and then call a function named “Broadcast” that takes a string. In my Metronome.cs script I then converted the string to a float and told all of the objects listening to the metronome that a tick just happened. I wrote it this way because the very first tip given for iOS plugins is “Managed-to-unmanaged calls are quite processor intensive on iOS. Try to avoid calling multiple native methods per frame.” So I figured having Metronome.cs call a native function every frame to retrieve ticks would probably be a bad idea, but it turns out it’s better than using UnitySendMessage.

I changed the way the metronome works: when the native code gets to each tick, it pushes it onto the back of a list. Then, in the Update function for my C# Metronome, I pass a float array to a native Metronome_GetTicks function, which fills the array with all of the ticks in the tick list and returns how many ticks it put into the array. Then, for each tick in the array, I call the same Broadcast(float) method that I previously called from Broadcast(string). Now profiling the exact same level that gave me spikes looks like this:
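For reference, the C# side of that pull-based approach can be sketched like so (the names and fixed buffer size are illustrative, not the actual Cosmic DJ code; "__Internal" is how statically linked iOS plugins are imported):

    using System.Runtime.InteropServices;
    using UnityEngine;

    public class Metronome : MonoBehaviour
    {
        // fills ticks with up to maxTicks queued tick values
        // and returns how many it actually wrote
        [DllImport("__Internal")]
        private static extern int Metronome_GetTicks( float[] ticks, int maxTicks );

        private float[] tickBuffer = new float[16];

        void Update()
        {
            int tickCount = Metronome_GetTicks( tickBuffer, tickBuffer.Length );
            for( int i = 0; i < tickCount; ++i )
            {
                Broadcast( tickBuffer[i] );
            }
        }

        void Broadcast( float tick )
        {
            // tell all of the objects listening to the metronome that a tick happened
        }
    }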

Elephant in the Room

Yesterday was the first day of the Joue le Jeu – Play Along exhibition at the Gaîté Lyrique in Paris. This show has been curated by two of my Kokoromi cohorts, Heather Kelley and Cindy Poremba, along with Lynn Hughes from Concordia University in Montréal. These three amazing women have put together a show containing both “traditional” video games, which can be played in the gallery space by visitors, as well as a number of commissioned pieces created especially for the show. One of these commissions is my piece for the Gaîté’s Chambre Sonore.

The Chambre Sonore is a room that was custom built for the Gaîté, though not specifically for my piece, that contains a 10 channel sound system (8 mounted in the walls around the room, 1 in the ceiling, and a sub behind one of the walls), 8 pressure sensitive floor panels, and 8 color LED lights that ring the ceiling. All of these can be interfaced with from a computer under the stairs next to the room.

Elephant in the Room Wall Text

High Concept

The high concept of the piece is that the room has a digital creature living in it and that visitors can interact with the creature by stepping on the different floor panels in the room. Over time this interaction with the creature changes how it sounds, so that if someone visits it more than once in a day, or comes back several days later, it should sound noticeably different from their previous visit.

Implementation

The actual implementation of the piece differs slightly from the high concept, however. I had originally envisioned creating a single complex sound generating system that would have lots of different values that could be slowly changed by stepping on the various floor panels. In this way, it would have been a very slowly evolving soundscape, without a readily discernible way for visitors to tell that they were making a difference by being in the room. Ultimately, I decided that it would make more sense within the context of the exhibition as a whole if visitors were able to easily figure out that the piece was reacting to them.

So, the piece contains five different modes that it might be in when a visitor enters, and by paying close attention to the feedback provided by the lights and sounds, visitors should be able to figure out the “puzzle” of the mode and then intentionally push the mode to its end, triggering a transition to the next mode. The modes have the very unpoetic names of Rainstorm, Pulsing Tones, Soothe, Chirps, and Tune Maker.

Hyrax footprints stuck to the floor of the Chambre Sonore to indicate where visitors can stand to interact with the piece.

Each mode is a very different soundscape, helping to meet the goal of visitors hearing something different if they visit the room a second time. Rainstorm sounds like digital rain with a whooshing wind and if visitors stand in the correct place, they will anger a creature that will roar at them. Pulsing Tones is a large pulsing chord that visitors can turn into high-pitched static by increasing the amplitude modulation frequency applied to each note in the chord. Soothe is a gentle environment with a low frequency noise wash, long glassy tones taken from a harmonic series, and a soft heartbeat emanating from the wall. Visitors can create marimba-like tones by stepping on floor panels. Chirps emits periodic descending or ascending tones that are patched through a modulated delay effect. Visitors can use floor panels to increase and decrease the duration of the tones, or to increase and decrease the speed of the delay modulation. Tune Maker is a generative dance tune that visitors can add musical elements to simply by dancing around the floor. If they dance fast enough they are rewarded with a prerecorded dance tune that has choreographed lighting. Here’s a short video of what Rainstorm sounds and looks like.



Technology

As mentioned, there are three systems in the room that the computer interfaces with: 10 channel audio, 8 pressure sensitive floor panels, and 8 color LED lights. The audio runs through an audio card installed in the computer, out to two amplifiers. The floor panels send MIDI messages and are plugged into a USB MIDI interface. The lights are controlled using DMX, and I chose an ENTTEC DMX USB PRO to talk to them. I was able to interface with all of this hardware using Processing.

Visual proof that Elephant in the Room was built with Processing.

For the audio, I used Minim, my sound library. Almost all of the sound is generated in real-time using the UGen framework. However, to handle the 10 channels, I had to bypass using an AudioOutput object and write a custom class that could deal with the sound card on the computer. JavaSound gives access to audio devices through the Mixer class. Typically, for a multi-channel sound card, the outputs will be made available as stereo pairs. So, there’s a Mixer for channels 1/2, 3/4, 5/6, and so on. Once you have a Mixer, you can ask it for a SourceDataLine that is used to write out audio data, but if a Mixer represents a stereo pair, you’re never going to be able to ask for a 10 channel SourceDataLine. In fact, even though two of the Mixers listed for the sound card were named 5.1 and 7.1, I was not able to ask those for 6 and 8 channel outputs. Instead, I gather up all the stereo pairs representing channels 1 through 10 and ask each of them for a stereo output. Then I generate 10 channels of audio from my root UGen and write out two channels of audio to each stereo output in turn. This lets me think about generating audio in 10 channel terms and makes it easy to write UGens that can pan sound around the room, send sound to just one speaker, or expand a stereo signal across several speakers.
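Sketched in bare JavaSound (not the installation’s actual code; real code also has to match Mixer names to channel pairs so the lines land in speaker order):

    import javax.sound.sampled.*;
    import java.util.*;

    // grab a stereo output line from each Mixer that represents a pair
    // of hardware channels (channels 1/2 through 9/10)
    SourceDataLine[] openStereoPairs() throws LineUnavailableException
    {
        AudioFormat stereo = new AudioFormat( 44100, 16, 2, true, false );
        ArrayList<SourceDataLine> pairs = new ArrayList<SourceDataLine>();
        for( Mixer.Info info : AudioSystem.getMixerInfo() )
        {
            if ( pairs.size() == 5 ) break;
            Mixer mixer = AudioSystem.getMixer( info );
            DataLine.Info lineInfo = new DataLine.Info( SourceDataLine.class, stereo );
            if ( mixer.isLineSupported( lineInfo ) )
            {
                SourceDataLine line = (SourceDataLine)mixer.getLine( lineInfo );
                line.open( stereo );
                line.start();
                pairs.add( line );
            }
        }
        return pairs.toArray( new SourceDataLine[0] );
    }
    // each render pass: generate 10 channels from the root UGen, then write
    // channels 2i and 2i+1, interleaved as bytes, to pairs[i]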

Flashing disco lights during Tune Maker.

For the lights, I used a library for Processing called dmxP512, which made dealing with the lights super easy. At first, I thought I would have to use MIDI and a light cue building program that had been used with previous installations for the Chambre Sonore, but the direct control that dmxP512 gave me was much better. I wound up writing a wrapper class for it that let me set colors using a color variable in Processing and also let me fade colors over time. A very useful idea I came up with was to allow for animating color and intensity separately. So I can set a light to color(255,15,23), for instance, but then set the intensity to 0.2 and wind up with that color at 20% brightness. I simply multiply the RGB components by the intensity before sending them to the lights and this allows me to do stuff like blink a light by animating the intensity while at the same time slowly shifting the color from blue to pink.
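As a tiny sketch of that separation (Processing-flavored; the wrapper call in the comment at the end is hypothetical):

    // the final color sent to a light is the animated color
    // scaled by the animated intensity
    color withIntensity( color c, float intensity )
    {
        return color( red(c) * intensity, green(c) * intensity, blue(c) * intensity );
    }

    // e.g. blink a light by animating intensity while the color animates separately:
    // dmxOut.sendColor( lightIndex, withIntensity( currentColor, blinkAmount ) );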

The floor communicates with the computer by sending MIDI messages and I was able to receive these by writing some pretty straightforward JavaSound MIDI code. Basically, you just ask for the Transmitter object of the MIDI device that the floor is plugged into, write a class that implements the Receiver interface, and set an instance of that class as the Receiver for the Transmitter from the MIDI device. So I wrote a FloorReceiver class that I can add a FloorListener to so that the MidiMessage parsing can be in one place and other code can simply receive notifications when a floor panel goes down or up. I believe the floor is made by a French company called Interface-Z and it appears to send only one kind of MIDI message: a NOTE_ON message where the note number indicates which pad sent the message and the velocity indicates whether the panel went down (64) or up (0). I did find the floor to be a little bit frustrating to work with because some panels respond in only a small area and others a comparatively large one. This is the reason for the footprint stickers: to show people where they can stand to best interact with the room and also which direction to face so they will hopefully see a connection between where they are standing and the light that is blinking at them.
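Sketched out (not the installation’s exact code; the listener notification is elided):

    import javax.sound.midi.*;

    public class FloorReceiver implements Receiver
    {
        public void send( MidiMessage message, long timeStamp )
        {
            if ( message instanceof ShortMessage )
            {
                ShortMessage sm = (ShortMessage)message;
                if ( sm.getCommand() == ShortMessage.NOTE_ON )
                {
                    int panel   = sm.getData1();
                    boolean down = ( sm.getData2() == 64 );
                    // notify all registered FloorListeners here
                }
            }
        }

        public void close() {}
    }

    // hooking it up, given the MidiDevice the floor is plugged into:
    // floorDevice.getTransmitter().setReceiver( new FloorReceiver() );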

The Chambre Sonore lit with cool cyan lights.

Reception

From what I have seen, most people enjoy interacting with the room and seem to understand that they are having some kind of effect on the sound. But aside from several people I know personally who really made an effort to figure out the room, I haven’t seen anyone actually clue in to how to get a mode to an end point. Certainly, part of this is due to the fact that nothing explicitly states that this is possible, but I think it also has to do with the fact that the only feedback is from lights and sound. Each floor panel is associated with a particular light and speaker, and this relationship is maintained throughout all the modes, but unfortunately the layout of the room doesn’t physically reinforce the relationship. This isn’t a problem per se, since the piece is meant to work as simply an interactive ambient soundscape, and the piece has been switching between modes despite people not quite getting it. I had hoped to do more observation of the general public interacting with the room, but if I’m in the room, it sort of changes the experience for people, so I’ve mostly stayed out of sight. All in all, I’m pretty happy with how it’s going so far.

Elephant in the Room shows at the Gaîté Lyrique in Paris as part of Joue le Jeu – Play Along until August 12th, 2012.