Archive for the 'Fall 2008' Category


Thinking About Toys

I’ve been thinking more about games than I have about toys recently, which I intend to remedy right now. Every Tuesday for the last seven weeks, I’ve sat in a room with twelve or so other people talking very seriously about what makes a good toy. I thought I remembered all my toys and knew for sure which were my favorites, but the conversation dredged up fond memories of toys I’d all but forgotten. My blue plastic Cinexin projector, for instance:

My other favorite toys included Magia Borrás, my Exin castle, my venerable STX 4X4 Scalextric, Lego Technic, Star Wars figures and vehicles, Transformers (especially Optimus Prime and a tank/plane triple changer whose name eludes me), and the contents of my toy bucket taken as a whole. I’m sure I’m forgetting something, but those are the ones I remember playing with the most.

And they all share at least one of the following characteristics that set them apart from sucky toys:

  • They put you in control
  • They gave you a grown-up ability
  • They allowed you to hide or disguise yourself
  • They lent themselves to the invention of stories
  • They caused something to happen or move
  • They impressed or surprised your friends, mom, or other adults

Which is why, I think, there aren’t that many new toys. Sure, Toys R Us in Times Square is brimming with an amazing assortment of toys, but most of them are just repackaged, rebranded, carefully gendered versions of a dozen or so archetypal toys: the doll, the science/discovery toy, the vehicle, the teddy bear, the puzzle, the art/creative material, the noisemaker, the building block, the board game, the bicycle, the costume, the ball, the tent, the weapon, the “learning” toy, the miniature [insert adult locale or situation], and the videogame.

Which is also why I stuck close to a traditional toy when I started designing.

Bugging out, now with sound!

In preparation for my return to school on Tuesday, I wanted to ease myself back into the swing of things by playing a little with Processing. Since I never got around to playing with sound last semester, I added some sound effects to my cockroach sketch using Minim.

Check them out here.

PComp Final: T.H.A.W. née Icy Hot

The path that led to my PComp final (and Winter Show submission) was as circuitous as it was fortuitous, an exercise in a kind of free association on which I rarely get to follow through.


Since wind featured prominently in my work this semester, I started out thinking I would extend the theme by rejigging my ICM midterm to work with a physical fan. Turn on the fan, aim it at an image on a screen, and the pixels are blown around in the direction you’re holding the fan and with a force proportional to how close the fan is to the screen. But when I was discussing the idea with Bridge, she said something along the lines of, “So when the person holds the hairdryer against the screen…”—hairdryer? Whoa.

I haven’t had any significant hair since before puberty, so hairdryers are outside the realm of my ordinary experience. I was envisioning mounting an infrared LED on the end of one of the blades of a little AA-powered handheld fan, the kind that comes in summer camp care packages, and using a camera to track it, determining the distance and the angle by the relative size and shape of the ellipse produced by the spinning light. A simple but elegant solution.

But a hairdryer! A hairdryer is a gun of feminist theory, it’s a pleasant pedal tone that harmonizes smoothly with even the most unpleasant of singing voices while erasing the sour notes, and it’s a sound-barrier that insulates the user from the encroachments of doorbells, dinner calls, and telephones. It boasts two dimensions a fan doesn’t have—noise and temperature. It wouldn’t do to ignore these by using a hairdryer just to blow pixels around a screen.

But how to take advantage of them? I had lots of ideas. Melting something onscreen (ice cubes perhaps?), blow drying virtual hair, a game in which opponents have to move a virtual feather along a screen using hairdryers, a hairdressers’ duel at dawn—most of these felt more like programming projects than explorations of physical computing. Ultimately, I was drawn to the hairdryer because it’s fun to hold, it’s noisy, and it’s very responsive. A good project would necessarily explore and combine each of these aspects.

I’d also been itching to play with MIDI since learning during midterms that it was nothing but a series of numbers sent at a specific baud rate. How about a hairdryer-powered musical interface? That seemed ripe with possibility.

Add a couple of in-class discussions, a really productive crit group show and tell, and several hallway conversations with my classmates, and so T.H.A.W. was born.


MIDI is really fun and surprisingly easy to implement. Basically, it’s a status byte followed by one or two data bytes that set parameters. For instance, if you want a MIDI device, say a synthesizer like the Korg NX5R I used, to play a note, you send it a byte that tells it what channel to play the note on, another that specifies which note, and a third that determines the volume. Depending on the device, there are also control messages that allow you to choose a sound bank, change instruments, vary the pitch, add effects, and even control esoteric parameters such as attack and decay time.
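To make that concrete, here’s a minimal sketch of a note-on message in plain Java (rather than Arduino code; the class and method names are my own, not the project’s): a status byte whose high nibble says “note on” and whose low nibble carries the channel, followed by the note number and the velocity.

```java
// Assembling a MIDI note-on message as raw bytes. The class and
// method names are illustrative, not from the original project.
public class MidiNoteOn {
    static int[] noteOn(int channel, int note, int velocity) {
        return new int[] {
            0x90 | (channel & 0x0F),  // status byte: note-on + channel (0-15)
            note & 0x7F,              // data byte 1: note number (0-127)
            velocity & 0x7F           // data byte 2: velocity/volume (0-127)
        };
    }

    public static void main(String[] args) {
        int[] msg = noteOn(0, 60, 80);  // middle C at volume 80 on channel 0
        System.out.printf("0x%02X %d %d%n", msg[0], msg[1], msg[2]);
    }
}
```

On the Arduino side, those three bytes simply get written out over serial at MIDI’s 31250 baud.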

To get started with MIDI, I read up on the specification here (the tables at the bottom of the page were especially helpful) and then built the hardware interface for the Arduino based on the instructions here and here.

I ran into one problem, which I skirted more than addressed. If you turn a note on and off every loop, it doesn’t produce a smooth sound. So I reprogrammed the sounds I was using on the synth not to decay, and turned them on only once, the first time through the loop. Then I sent control changes linked to the sensor values to vary the volume. Obviously, this is only possible if you’re using sounds with infinite sustain that don’t vary too much over time. The code looks something like this:

note1 = map(thermValue0, 0, 4094, 0, 127);

if (note1 > threshold) {
  if (startup1) {
    noteOn(0xB0, 0x78, 0x00);  // all sound off, channel 0
    noteOn(0xB0, 0x00, 0x51);  // bank select: programA
    noteMod(0xC0, 19);         // select sound 20
    noteOn(0x90, 60, 80);      // turn the note on at middle C and volume 80 on channel 0
    noteOn(0xB1, 0x78, 0x00);  // all sound off, channel 1
    noteOn(0xB1, 0x00, 0x00);  // bank select: GMa
    noteMod(0xC1, 125);        // select sound 126
    noteOn(0x91, 70, 80);      // play the B-flat above middle C at volume 80 on channel 1
    startup1 = false;
  }
  noteOn(0xB0, 0x07, constrain(110 - note1, 0, 127));  // change volume of the note on channel 0
  noteOn(0xB1, 0x07, constrain(note1 - 20, 0, 127));   // change volume of the note on channel 1
}

Here was one of my early experiments, controlling the modulation of a note using a potentiometer:


Actually constructing T.H.A.W. was the most fun part of the entire process. Once I got the MIDI circuit working, all that was left to do was to replicate it five times to create five different inputs and sound pairs, set up an LED driver to run five RGB LEDs (which have three cathodes and one anode each and thus require more PWM outputs than the Arduino has built in), and decide what the whole thing should look like.

The matter of T.H.A.W.’s appearance was resolved serendipitously. I was debating taking the subway to school on a nasty rainy Wednesday but decided to walk along Houston and get wet. At the corner of Mott, I saw a piece of enameled black metal, maybe an old shelf, atop a pile of garbage. I went over to inspect and discovered it was a piece of aluminum that would be perfect as the base for my project.

T.H.A.W. Cube

LED and Thermistor

At this point, I wasn’t quite sure what I’d be aiming the hairdryer at; I just knew it would have an LED embedded in it and that it would look great glowing on that shiny black enamel surface. A couple of days later, I went down to Canal Plastics and found these little acrylic ice cubes, which screamed, “Stick an LED in me and melt me with a hairdryer!” So that is exactly what I did.

For the LED driver, I used a Texas Instruments TLC5940, which has an Arduino library written for it. I spent four hours pulling out what little hair I have trying to get it to work before I realized two things:

  1. RGB LEDs can be either common cathode or common anode. Pay attention! In the common-cathode case, the red, green, and blue pins are anodes, so each needs its own PWM’ed power; in the common-anode case, those pins are cathodes, so each needs its own PWM’ed ground (remember, what matters is the voltage across each color, measured relative to the shared pin). The TLC5940 sinks current, so it expects common-anode LEDs.
  2. Any pinMode declarations of Arduino input pins not associated with the TLC5940 need to come before the Tlc.init().

And that’s it. I played with the hairdryer and the thermistors to work out good timing for the heating and cooling. To keep users from heating the thermistors to the point where they would take several minutes to cool down, I max out the LEDs’ red value early, so that after only five or six seconds of hairdrying their color jumps from a kind of pink directly to bright red, with a flicker intended to suggest to the user that it’s a good time to move on to another cube.
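The jump-to-red logic can be sketched like this in plain Java (the jump-point value and the Arduino-style map helper are my own stand-ins, not the sketch’s actual thermistor thresholds):

```java
// The early red max-out described above, reconstructed in plain Java.
// jumpPoint and the helper names are illustrative, not the real code.
public class CubeColor {
    // Arduino-style map(): rescale x from one range to another
    static int mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
        return (int) ((x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin);
    }

    // Map a thermistor reading to the LED's red value, maxing out well
    // before the reading does so users move on to the next cube.
    static int redValue(int therm, int jumpPoint) {
        if (therm >= jumpPoint) return 255;            // jump straight to bright red
        return mapRange(therm, 0, jumpPoint, 0, 255);  // otherwise fade pink toward red
    }

    public static void main(String[] args) {
        System.out.println(redValue(300, 600));  // partway warmed up
        System.out.println(redValue(700, 600));  // past the jump point: full red
    }
}
```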

Here’s the wiring:

T.H.A.W. Arduino

And the construction:

T.H.A.W. Underside

And the final result, sans sound:

Shoots and Leaves

The online version of Shoots and Leaves is here. It uses the mouse and keyboard instead of a Wiimote.

For my ICM final, I teamed up with the lovely and talented Michelle Mayer, and together we set out to create something beautiful out of something repulsive. My initial idea was to cause red flowers to explode in a messy splatter all over a screen if you aimed a gun at your own head while standing in front of said screen. That seemed derivative (that t-shirt of Itamar’s with the birds flying out of the guy’s head) and more of a PComp problem, so we restated the challenge as creating life out of death.

The idea became to create an algorithmic seed that, once planted, would grow on its own in an unpredictable and unique manner. We planned to project onto a mannequin wearing a white shirt and shoot at it with a Wiimote. But instead of drawing blood, our shots would draw flowering vines. Not to get too meta-geeky here, but the idea and its visualization are more than a little reminiscent of Project Genesis in Star Trek II: The Wrath of Khan.

When creating the effects for that scene, Industrial Light and Magic had to invent a bunch of new graphics technologies. And good old retinal scanning! Oh brave new world that has such technologies in it!

Anyway, we wanted to do something similar. Our first thought was to use input from a Wiimote to trigger a series of video elements we would have created ahead of time, but Processing’s limitations when dealing with large numbers of simultaneous video clips, the prospect of spending even more time in After Effects, and Dan Shiffman’s encouragement to seek out a programmatic solution convinced us that we might as well program life. Our initial proposal is here.

Creating Branches

The first task we tackled was branching. If our flowering vines were going to be at all realistic, they would have to unpredictably spawn other branches as they grew. This was a matter of setting a series of necessary preconditions for branching and a probability of its occurrence once those preconditions were met. Then all we had to do was pass a point on an existing branch as the origin for a new branch instance. That actually wasn’t so difficult, though initially all the existing branches would branch simultaneously and with ever-increasing frequency until the whole screen filled up exponentially. Adding a variable in each branch to keep track of its lifetime, and randomly changing it upon a successful branch, solved that.
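The scheme boils down to precondition checks plus a dice roll. A rough sketch in plain Java (all names and constants here are illustrative, not our actual Processing code):

```java
// Precondition-gated, probabilistic branching: a branch may only
// spawn a child once it is old enough and under the branch cap, and
// even then only with probability p. Names and constants are mine.
import java.util.Random;

public class Branching {
    static final Random rng = new Random(42);

    static boolean tryBranch(int lifetime, int branchCount, double p) {
        if (lifetime < 100) return false;     // precondition: old enough to branch
        if (branchCount >= 20) return false;  // precondition: under the branch cap
        return rng.nextDouble() < p;          // then branch with probability p
    }

    public static void main(String[] args) {
        int branches = 0;
        for (int i = 0; i < 1000; i++) {
            if (tryBranch(150, 3, 0.1)) branches++;  // roughly 10% should succeed
        }
        System.out.println(branches);
    }
}
```

Resetting the lifetime on each successful branch is what keeps the growth from going exponential.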

Curved Paths

The next challenge was getting the branches to move in nice curvilinear paths that were nonetheless irregular. We spent an entire day playing with sine functions, but to no avail; our vines looped like drunken rollercoasters. The solution occurred to me right as I was going to bed one night. Taking my cue from Craig Kapp’s brilliant gravity simulator, I thought, why not have a bunch of balls that exert a force on the growing branches bouncing around invisibly in the background? Have three, say, and when a branch is born, assign it to follow one randomly. Make the force proportional to the square of the distance and it should yield random-looking curved paths. And it did!
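The attractor idea reduces to a few lines: each growing tip steps toward a hidden ball with a force that scales with the square of the distance. A sketch under those assumptions, in plain Java (names and the constant k are illustrative):

```java
// One step of a growing tip being pulled toward an invisible ball,
// with force proportional to the square of the distance as described
// above. Names and the strength constant k are illustrative.
public class Attractor {
    static double[] step(double x, double y, double bx, double by, double k) {
        double dx = bx - x, dy = by - y;
        double d = Math.sqrt(dx * dx + dy * dy);
        if (d == 0) return new double[] { x, y };  // already at the ball
        double f = k * d * d;                      // force scales with distance squared
        // move along the unit vector toward the ball, scaled by the force
        return new double[] { x + f * dx / d, y + f * dy / d };
    }

    public static void main(String[] args) {
        double[] p = { 0, 0 };
        for (int i = 0; i < 5; i++) {
            p = step(p[0], p[1], 100, 50, 0.00001);  // ball held fixed for simplicity
            System.out.printf("%.2f %.2f%n", p[0], p[1]);
        }
    }
}
```

In the real sketch the balls bounce around and each branch follows one chosen at random, which is what makes every vine’s curve unique.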

Aging: A Perennial Problem

Next we decided we wanted our branches to thicken with age, as they would in real life. This we accomplished by storing each branch’s last fifty x and y positions and then drawing fifty successively larger semi-transparent ellipses at each. This creates the illusion of a thickening that follows the branch’s sprout. It also slows things down a fair amount because of the memory it requires.
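The history-based thickening can be sketched as a fixed-size trail of positions, with ellipse diameters growing toward the oldest stored point. In plain Java (constants and names are mine; the drawing itself is omitted):

```java
// The thickening trick: remember each branch's last fifty positions
// and draw successively larger translucent ellipses at them. Here we
// just manage the trail and compute diameters; constants are mine.
import java.util.ArrayDeque;
import java.util.Deque;

public class Thickening {
    static final int HISTORY = 50;
    final Deque<float[]> trail = new ArrayDeque<>();

    void record(float x, float y) {
        trail.addLast(new float[] { x, y });
        if (trail.size() > HISTORY) trail.removeFirst();  // drop the oldest point
    }

    // ellipse diameter grows with age, so the trail tapers toward the tip
    static float diameter(int ageIndex) {
        return 2 + ageIndex * 0.2f;
    }

    public static void main(String[] args) {
        Thickening t = new Thickening();
        for (int i = 0; i < 60; i++) t.record(i, i);
        System.out.println(t.trail.size());  // capped at HISTORY
    }
}
```

Storing fifty points per branch is also exactly where the memory cost, and the slowdown, comes from.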

Switch Cases and Flowers

Our final design step involved implementing flowers and leaves (and little twirly tendrils which in our multi-day programming orgy we never quite figured out). Since the basic conditions for branching are no different from the conditions for sprouting leaves or flowering, we implemented a switch case that favored leaves:

if (b.check()) {  // if a branch hasn't just branched
  if (b.branchcount < 20) {  // and it hasn't already branched more than 20 times
    int chance = round(random(0, 3));  // pick one of the following cases randomly
    switch(chance) {
    case 0:
      b.branch();  // spawn a new branch
      b.lifetime = -b.branchcount * 200;  // delays the next sprout from the same branch
      break;
    case 1:
      // sprout a flower (body elided here)
      b.lifetime = -b.branchcount * 200;
      break;
    case 2:  // two leaf cases to ensure more leaves than flowers and branches
    case 3:
      // sprout a leaf (body elided here)
      break;
    }
  }
} else {
  // the branch just branched; let it keep growing
}


We addressed several interesting smaller problems (rotating the leaves using the arctangent function to ensure that they grew according to the direction in which the branch was moving, implementing Wiimote control, and the eventually discarded use of real images of leaves and flowers instead of programmatically drawn ones) in Encroachment, which is documented here.

Here's Michelle showing the project on a wall at ITP:

The Final Straw

For my last Commlab assignment, I abandoned Happy and After Effects and returned to stop motion, which was my favorite technique among all the ones we played with this semester. There is something so satisfying about taking a format with which I’m very comfortable (nice static Photoshoppable photographs) and transmogrifying it, by virtue of nothing other than repetition, into animation, a format that until recently provoked cold sweats.


I also wanted to revisit this photograph, which was part of an invitation I made for the party Bridge and I threw to announce our arrival in New York (that sounds so grand, but the fact that we threw a party doesn’t mean that anyone actually caught it).

Bridge and I discussed the idea and she thought it would be fun to film a discombobulated argument.  I had recently listened to the White Stripes song “There’s No Home for You Here” which I thought would make a great soundtrack.

So we set up the tripod and a bunch of lights in the apartment and separately photographed our eyes and our mouths as the song played.  I stitched the resulting photos into a four-frame collage in Final Cut.  I’m not all that happy with it.  It’s too slow; I should have taken about three times as many photos, though I’m not entirely convinced that the whole thing shouldn’t just be done in video to begin with. And syncing the sound was a total nightmare.

Also, and this was the feedback I got in class, why is there no interaction between the frames?  It seems a shame to set up these boundaries around each frame only to respect them!  It’s a decent proof of concept but it needs redoing, and that’s what’s great about being on an academic calendar.  In January, we’ll plan some interframe action and shoot it again, this time in DV.

Encroachment: The Buggiest Software on Earth!

Encroachment was a study for my ICM final that actually turned out to be pretty cool on its own. The online version, which uses the mouse and keyboard instead of a Wiimote, is here. Play with it!

I hate cockroaches. But they do lend themselves to creepy, jerky motion, which is exactly what I needed for this particular experiment. I wanted to do two things in a simple sketch before porting them over to Michelle’s and my flowering vines:

  1. Orient the cockroaches correctly along their direction of motion knowing only their current x and y positions and those one loop previous.
  2. Get the Wiimote working reliably.

Orienting the roaches was accomplished through trial and error, and there still seems to be some directional ambiguity: roaches moving at certain angles yield, given my code and slight variations, arctangents 180 degrees apart, causing the roach to flash and turn manically. Because the effect conveys a kind of skittishness that I associate with roaches and adds to the program’s overall creepiness, I didn’t try to correct it.

This is the code:

void display() {
  float slope = (y1 - y)/(x1 - x);
  float theta = atan(slope);
  // quadrant corrections arrived at by trial and error
  if (vx < 0 && vy > 0) rotate(theta - PI/4);
  if (vx > 0 && vy > 0) rotate(theta + PI/4);
  if (vx > 0 && vy < 0) rotate(theta + PI/4);
  if (vx < 0 && vy < 0) rotate(theta - PI/4);
}
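For what it’s worth, the ambiguity exists because atan() only covers half the circle. If I ever wanted to remove the flicker, atan2() resolves it by looking at the signs of both velocity components. A quick plain-Java illustration:

```java
// atan2 takes both components' signs into account, so headings a half
// turn apart get distinct angles and no quadrant if-chain is needed.
public class Heading {
    static double heading(double vx, double vy) {
        return Math.atan2(vy, vx);  // full -PI..PI range
    }

    public static void main(String[] args) {
        // 0.79 vs -2.36: opposite directions now get distinct angles
        System.out.printf("%.2f %.2f%n", heading(1, 1), heading(-1, -1));
    }
}
```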

The Wiimote

Getting the Wiimote working with Processing is not terribly difficult. It does, however, require a number of downloads and tweaks. First thing, you need to install the oscP5 library for Processing, which allows it, among other things, to receive the Wiimote’s data as OSC messages over Bluetooth (the Wiimote, as luck would have it, is a Bluetooth device). It’s available here. Then you need to install the interface that allows the Mac and the Wiimote to speak over Bluetooth. I used darwiinosc by Andreas Schlegel, which can be downloaded here.

Once both of those are installed, you simply run darwiinosc, connect the Wiimote by holding down buttons 1 and 2, and you should start to see the accelerometer readings graphed in the console. To get Processing to recognize the Wii, you need to import the oscP5 library and set up the Wiimote objects you’ll be using in your sketch (buttons, tilt, IR, acceleration, etc.). There is clear and exhaustive example code included with the library.

Two things that did take me a little while to figure out were the syntax for getting the buttons to work and the IR tracking. The first two lines of the following code were confusing the hell out of me until I realized that this function contains both the onPress and onRelease actions and that the eponymous boolean variable tells the function which to execute. This is how I got the Wiimote to vibrate only when you pressed the trigger:

void buttonA(int theValue) {
  buttonA = (theValue == 1);
  if (buttonA) {  // trigger pressed: spawn a roach at the pointer
    roaches.add(new Roach(trueX - 100, trueY - 137));
  } else {
    // trigger released: nothing to do
  }
}

The Wiimote contains an infrared camera that can track up to four separate infrared LEDs. I’m lazy and only used one, but that did give me a kind of lopsided motion that I had to account for with hard-coded and totally inelegant adjustments. The IR function stores twelve variables in an array: the x and y positions and relative size of each of four possible LEDs. I just used the x and y position of one LED in a battery-powered sensor bar that I think came from GameStop, since distance from the screen was not a concern.

Wii GunAfter that was working, all I had to do was find a gun attachment for the Wii that felt enough like a real gun to conjure the visceral emotional connotations we needed for the final project (the realism of the gun for the purpose of creating and killing roaches seems beside the point). This one, which cost about $15, has an ingenious little piece of plastic that slides along the top of the remote and depresses the A button when you pull the upper trigger and a lever on the underside to press the B button when you pull the lower trigger. Perfect!

And here it is projected up on the wall!

After Shocks, or the Misadventures of Happy International

After Effects.  Wow.  I had no idea motion graphics were this fun, nor did I suspect they’d be this time-consuming.  But that’s also possibly because I went about this all wrong.  Instead of first going to the library to check out books filled with images that took hours and hours to Photoshop into animatability, I should have really worked visually on the story I wanted to tell.

It was frustrating to show my sketch in class and have Marianne say, “Ok, great, but move the camera around, give us multiple shots, you have infinite control, use it.”  We’ve read extensively on frames and points of view, we’ve storyboarded and shot multiple movies, and when finally we’re given total freedom, liberated from the constraints of physical cameras and perspectives, I immediately revert to overhead projector mode.

In any case, the story I was hoping to tell was that of Happy International, bon vivant and rake extraordinaire, who cruises the world in his speedboat picking up female dancers of all ethnicities and taking them on fabulous motorboat cruises of exotic locales with canals/navigable rivers/waterfronts.  I ended up with a paean to the puppet pin tool (really, I can’t gush enough) and a lot of clips that never came together.  I will revisit this particular assignment.

Here’s the painstakingly motion-tracked title sequence, lifted from an old educational video on

And here are Happy and his girls:

Exciting Laos!

Nice Pianist!

After three days of sweating in a tux, lugging heavy lights up forgotten back staircases of the Tisch building, getting coated in dust, being punched repeatedly in the hand and abdomen, and making quite a fuss to get access to a grand piano, followed by the better part of a week hunched over Final Cut in the video lab, here it is: our small visual sonata.

I was worried that the story wasn’t going to look very good, but Michelle’s color correcting wizardry resulted in exactly the look we wanted. It’s amazing to see a story I dreamed up years ago semi-ported to the screen!

My favorite part of the whole process was watching Michelle lug the camera and tripod all the way up twelve flights of stairs and then balance it precariously over the void between two opposing railings only to not be able to yell loud enough for Zach and me to hear her below (a trombone being played on some intervening floor, possibly the seventh, drowned her out completely).

Once again I’m amazed by the amount of depth and believability sound adds to a scene. Shots that just didn’t cut well together flowed naturally once they shared a soundtrack. Three people and a curtain became a full party with the addition of appropriate ambient sounds. And a silly stomp became cringeworthy with the addition of a little crunch.

I can’t imagine the sheer volume of work and organization that goes into a big studio movie, especially one with complicated live-action special effects. And honestly, I’m happy I don’t have to.
