Archive for the 'PComp' Category

PComp Final: T.H.A.W. née Icy Hot

The path that led to my PComp final (and Winter Show submission) was as circuitous as it was fortuitous, an exercise in a kind of free association that I rarely get to follow through on.


Since wind featured prominently in my work this semester, I started out thinking I would extend the theme by rejigging my ICM midterm to work with a physical fan. Turn on the fan, aim it at an image on a screen, and the pixels are blown around in the direction you’re holding the fan and with a force proportional to how close the fan is to the screen. But when I was discussing the idea with Bridge, she said something along the lines of, “So when the person holds the hairdryer against the screen…”—hairdryer? Whoa.

I haven’t had any significant hair since before puberty, so hairdryers are outside the realm of my ordinary experience. I was envisioning mounting an infrared LED on the end of one of the blades of a little AA-powered handheld fan, the kind that comes in summer camp care packages, and using a camera to track it, determining the distance and the angle by the relative size and shape of the ellipse produced by the spinning light. A simple but elegant solution.

But a hairdryer! A hairdryer is a gun of feminist theory, it’s a pleasant pedal tone that harmonizes smoothly with even the most unpleasant of singing voices while erasing the sour notes, and it’s a sound-barrier that insulates the user from the encroachments of doorbells, dinner calls, and telephones. It boasts two dimensions a fan doesn’t have—noise and temperature. It wouldn’t do to ignore these by using a hairdryer just to blow pixels around a screen.

But how to take advantage of them? I had lots of ideas. Melting something onscreen (ice cubes perhaps?), blow drying virtual hair, a game in which opponents have to move a virtual feather along a screen using hairdryers, a hairdressers’ duel at dawn—most of these felt more like programming projects than explorations of physical computing. Ultimately, I was drawn to the hairdryer because it’s fun to hold, it’s noisy, and it’s very responsive. A good project would necessarily explore and combine each of these aspects.

I’d also been itching to play with MIDI since learning during midterms that it was nothing but a series of numbers sent at a specific baud rate. How about a hairdryer-powered musical interface? That seemed ripe with possibility.

Add a couple of in-class discussions, a really productive crit group show and tell, and several hallway conversations with my classmates, and so T.H.A.W. was born.


MIDI is really fun and surprisingly easy to implement. Basically, it’s a status byte followed by one or two data bytes that set parameters. For instance, if you want a MIDI device, say a synthesizer like the Korg NX5R I used, to play a note, you send it a status byte that tells it which channel to play the note on, a data byte that specifies the note, and a second data byte that sets the volume. Depending on the device, there are also control messages that let you choose a sound bank, change instruments, vary the pitch, add effects, and even tweak esoteric parameters such as attack and decay time.
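As a concrete sketch (plain C rather than my actual Arduino code), a note-on message is three bytes: a status byte whose high nibble marks it as note-on and whose low nibble carries the channel, followed by the two data bytes:

```c
/* A minimal sketch of how the three bytes of a MIDI note-on
   message fit together. channel is 0-15; note and velocity are 0-127. */
void midi_note_on(unsigned char channel, unsigned char note,
                  unsigned char velocity, unsigned char out[3]) {
    out[0] = 0x90 | (channel & 0x0F); /* status byte: note-on, plus channel */
    out[1] = note & 0x7F;             /* data byte 1: which note (60 = middle C) */
    out[2] = velocity & 0x7F;         /* data byte 2: velocity/volume */
}
```

Turning a note off is either the 0x80 status byte or, equivalently on most devices, a note-on with velocity zero.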

To get started with MIDI, I read up on the specification here (the tables at the bottom of the page were especially helpful) and then built the hardware interface for the Arduino based on the instructions here and here.

I ran into one problem, which I skirted rather than addressed. If you turn a note on and off every loop, it doesn’t produce a smooth sound. So I reprogrammed the sounds I was using on the synth not to decay and turned them on only once, the first time through the loop. Then I sent control changes linked to the sensor values to vary the volume. Obviously, this only works with sounds that sustain indefinitely and don’t vary too much over time. The code looks something like this:

note1 = map(thermValue0, 0, 4094, 0, 127);

  if (note1 > threshold) {
    if (startup1) {
      noteOn(0xB0, 0x78, 0x00); // all sound off, channel 0
      noteOn(0xB0, 0x00, 0x51); // bank select: programA
      noteMod(0xC0, 19);        // select sound 20
      noteOn(0x90, 60, 80);     // turn the note on at middle C and volume 80 on channel 0
      noteOn(0xB1, 0x78, 0x00); // all sound off, channel 1
      noteOn(0xB1, 0x00, 0x00); // bank select: gma
      noteMod(0xC1, 125);       // select program 126
      noteOn(0x91, 70, 80);     // play the Bb above middle C at volume 80 on channel 1
      startup1 = false;
    }
    noteOn(0xB0, 0x07, constrain(110 - note1, 0, 127)); // change volume of the note on channel 0
    noteOn(0xB1, 0x07, constrain(note1 - 20, 0, 127));  // change volume of the note on channel 1
  }

Here was one of my early experiments, controlling the modulation of a note using a potentiometer:


Actually constructing T.H.A.W. was the most fun part of the entire process. Once I got the MIDI circuit working, all that was left to do was to replicate it five times to create five different inputs and sound pairs, set up an LED driver to run five RGB LEDs (which have three cathodes and one anode each and thus require more PWM outputs than the Arduino has built in), and decide what the whole thing should look like.

The matter of T.H.A.W.’s appearance was resolved serendipitously. I was debating taking the subway to school on a nasty rainy Wednesday but decided to walk along Houston and get wet. At the corner of Mott, I saw a piece of enameled black metal—maybe an old shelf—atop a pile of garbage. I went over to inspect and discovered it was a piece of aluminum that would be perfect as the base for my project.

T.H.A.W. Cube

LED and Thermistor

At this point, I wasn’t quite sure what I’d be aiming the hairdryer at; I just knew it would have an LED embedded in it and that it would look great glowing on this shiny black enamel surface. A couple of days later, I went down to Canal Plastics and found these little acrylic ice cubes which screamed, “Stick an LED in me and melt me with a hairdryer!” which is exactly what I did.

For the LED driver, I used a Texas Instruments TLC5940, which has an Arduino library written for it. I spent four hours pulling out what little hair I have trying to get it to work before I realized two things:

  1. RGB LEDs can be either common cathode or common anode. Pay attention! In the first case, the colors share a single ground and each of the red, green, and blue anodes needs its own PWM’ed supply; in the second, the colors share a single power pin and each color’s cathode needs its own PWM’ed path to ground, which is what a current-sinking driver like the TLC5940 provides.
  2. Any pinMode declarations for Arduino input pins not associated with the TLC5940 need to come before the Tlc.init() call.

And that’s it. I played with the hairdryer and the thermistors to work out good timing for the heating and cooling. To keep users from heating a thermistor to the point where it would take several minutes to cool down, I max out the LED’s red value early: after only five or six seconds of hairdrying, a cube’s color jumps from a kind of pink directly to bright red, with a flicker intended to suggest to the user that it would be a good time to move on to another cube.
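In rough terms (the names and cutoff here are my own illustration, not the project’s actual values), the color-jump logic works like this:

```c
#include <stdbool.h>

/* Illustrative sketch of the color jump described above: below a cutoff
   the red channel tracks the heat reading; past it, red slams to full
   and a flicker flag tells the display loop to blink. */
typedef struct { int red; bool flicker; } CubeColor;

CubeColor heat_to_color(int heat, int cutoff) {
    CubeColor c;
    if (heat < cutoff) {
        c.red = heat;      /* pinkish: red rises with temperature */
        c.flicker = false;
    } else {
        c.red = 255;       /* jump straight to bright red */
        c.flicker = true;  /* a hint that it's time to move on */
    }
    return c;
}
```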

Here’s the wiring:

T.H.A.W. Arduino

And the construction:

T.H.A.W. Underside

And the final result, sans sound:

PComp Lab 7: High Current Loads and H-Bridges

This blog has too many pictures of circuits. Here’s Dominique Wilkins in Shanghai with some Chinese fans who didn’t know who he was but knew he must be famous. I really miss China.

And now, on to the circuits!

DC motors. Fun. I am so not an engineer, though. I couldn’t remember off the top of my head how the TIP120’s pins are arranged, even though we used a passel of them in our midterm. That too, I assume, will come with time.

I worked with a 3V motor so I’m not sure the current load was high enough to merit the transistor, but I used it anyway, just in case. I ran the motor (which I attached to the gearbox just for fun) with a pot as per the lab instructions, but got bored quickly, so I replaced the pot with a photo sensor and the DC motor with a 1.3V vibrating DC motor. I wasn’t sure how to convert the 3.3V the Arduino outputs into the 1.3V the motor requires using physical components, so I figured that if I pulsed the motor using analogWrite I could approximate 1.3V without damaging the motor. When the light drops below a certain threshold, the Arduino outputs 100 (out of a possible 255) to the motor. Nothing smelled bad or sounded funny, so I’m assuming I figured it out ok. The code is here.
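The back-of-the-envelope math: analogWrite(100) holds the pin high 100/255 of the time, so the average voltage delivered is the supply voltage scaled by that duty cycle. A quick check of the arithmetic (plain C, just to verify):

```c
/* Average voltage delivered by PWM for a given analogWrite value (0-255). */
double pwm_average(double supply_volts, int duty) {
    return supply_volts * duty / 255.0;
}
```

pwm_average(3.3, 100) comes out to roughly 1.29V, close enough to the motor’s 1.3V rating; presumably the motor’s inductance smooths the pulses toward that average.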

It feels like I’m a step closer to being able to realize my idea of skittish devices that I first explored here.

It took a little doing to get the h-bridge circuit working–I miswired it a couple of times before this:

I kind of zoned out in class while Tom was explaining h-bridges but after putting this circuit together, I understand how they work, which feels pretty good. Two months ago, I was struggling with the idea of a switch. Now, if only I can come up with a final project that works as well as this:

PComp midterm in the key of C: Keeping Track of the Spin

For our midterm, Patrick Grizzard, Ted Hayes, and I built a data auralizer.

We started discussing ideas around wind chimes and (with the help of significant caffeine) eventually ended up at SpinTone: an array of ten 70 mm 6W computer fans, each attached to a news outlet and mounted over a Smart Water bottle that resonates with a conch-like sound whenever its corresponding fan turns on.

It should be noted here that for some reason (probably the frequencies involved), none of the microphones we used to record the SpinTone were able to pick up the sound of the bottles over the noise of the fans. You can almost hear them in this video:

By varying the size of each bottle and the amount of water in it, we created internally consonant but mutually dissonant sounds for the liberal and conservative media outlets. So you get a nice harmonic interval if the Huffington Post and the New York Times are talking about something, but nasty noise when Fox News joins the conversational fray.

When a user inputs one or more search terms, a program we wrote in Processing uses the Yahoo! search API to query each of the sites and return the total number of search results. Our goal was to have some cutoff point that would determine whether an individual fan turned on or stayed off, but because the extent of each news source’s online archive varied tremendously, so did our totals. For instance, a search for a common term such as “France” in the New York Times tended to return between 100,000 and several million results, while the same term returned a fraction of that from the Wall Street Journal—not because France was being discussed disproportionately more in the New York Times but because the Times‘s online archive is much more extensive. We hacked together a somewhat arbitrary set of scaling factors; a more robust version would delimit searches by date range, something we were unable to do through the Yahoo! API.
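The scaling step itself is trivial; roughly this (with made-up factor and cutoff values, since I no longer remember the real ones):

```c
#include <stdbool.h>

/* Hypothetical version of the per-outlet scaling described above: each
   outlet's raw hit count is multiplied by a hand-tuned factor before
   being compared against a single shared cutoff. */
bool fan_on(long raw_hits, double scale_factor, long cutoff) {
    return raw_hits * scale_factor > cutoff;
}
```

So a million Times hits might get scaled by 0.01 while a smaller archive’s count is left nearly alone, bringing both into the same range before the comparison.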

There is a ten-second delay as the program gets results from each of the sites. Once the results are in, Processing interprets them and sends them serially to an Arduino running this code. The fans attached to outlets which return a number of search results above the cutoff turn on.
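Our exact serial format isn’t shown above, but one compact scheme for this kind of setup (an illustration, not necessarily what our Processing sketch sent) packs the on/off state of all ten fans into a single bitmask that fits in two serial bytes:

```c
/* Hypothetical encoding: pack ten fan on/off flags into one 16-bit
   mask, one bit per fan, with bit 0 corresponding to fan 0. */
unsigned int pack_fans(const int states[10]) {
    unsigned int mask = 0;
    for (int i = 0; i < 10; i++)
        if (states[i])
            mask |= (1u << i);
    return mask;
}
```

The Arduino end would then read the two bytes and test each bit to decide which fans to switch on.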

This is very nice, but it’s also fun to play the fans like an instrument (a news organ?) using the computer keyboard:

The nice thing about SpinTone is that it is pretty much infinitely extensible: it’s incredibly easy to change the sites each fan is linked to and because it relies on Yahoo! search rather than XML feeds, if a site’s online, SpinTone can play its results. Some possible mods:

  • I’m Your Biggest Fan: Rather than being linked to a particular site, each fan is linked to a particular celebrity across gossip sites. For when you have to know if Paris or Nicole is hotter right now.
  • Baseball Blowout: Each fan represents a particular game and one glance is enough to tell you who’s up.
  • Election 2008: The fans keep track of election data as it comes in from various sources. Know who’s calling what when without switching channels!

Some other related videos:

Initial Proof of Concept (can’t hear the bottle):

Ted’s initial fan test:

All wired up:

The final wiring with a detailed explanation:

With LED’s (and good bottle sound):

PComp Lab 6: Ongoing serial

This lab was straightforward and the concepts not all that difficult, at least after having gone through them in class first. Of the two methods for serial communication, I definitely find the punctuation method more intuitive, though I can see the advantages of call and response and will probably rely on it rather than punctuation when I mock up our midterm project software this week.

There’s not much to document other than actually having done the lab. I used a potentiometer and a photo sensor as my analog inputs and a two-state switch instead of a button (I left my buttons at home). Here is my wiring:

Here is an artistic shot of my wiring (which I’m definitely starting to actually understand now):

Here is the screen showing all the values I got when using the punctuation method:

And here are all the values that the handshake method yielded:

PComp Lab 5: Serial communicator

Though simple, this lab was my favorite so far. Finally I’m beginning to see how we take input from the real world and make it do fun digital things. It gave me lots of ideas, principal among them this: what if you wired a handheld fan in such a way that when someone held it next to a screen leaves/dots/pinwheels on the screen responded accordingly? You could sense distance and direction. I really want to try that. Anyway, it was a great joy to finally arrange a liaison between Arduino and oh so coy Processing.

I set up the lab looking only at the schematic (I’m trying to get better at reading diagrams), which wasn’t particularly challenging given the simplicity of this particular wiring. I did manage to wire my potentiometer so that the readings increased when it was turned counterclockwise and decreased when it was turned clockwise. Switching the power and ground connections fixed that.



I got the expected gobbledygook on the serial monitor.


And then I wrote up the graphing app in Processing, which outputted the following when I turned the potentiometer knob back and forth:


But that didn’t look so interesting, so by altering the Arduino code to


I got a more colorful graph:

I tried a photo sensor:

And a force sensor

The force sensor’s range seemed too narrow, so I mapped its maximum and minimum values to 255 and 0

analogValue = map(analogValue, 0, 50, 0, 255);

and got a much noisier graph:

To do: I’m going to get the values to display as numbers every so often on the graph and I’d really like to rewrite the code so that the screen scrolls rather than the graph (possibly by storing each value in an array and then shifting all the values over each time a new value is read, at least that’s what I’m thinking now). Also, I really should have tried to get a video screenshot of the graph moving rather than the various stills.
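The array-shifting idea sketched out (in plain C for brevity; the Processing version would look much the same):

```c
/* Slide every stored value one slot to the left, dropping the oldest,
   and append the newest reading at the end. */
void push_value(int values[], int len, int newest) {
    for (int i = 0; i < len - 1; i++)
        values[i] = values[i + 1];
    values[len - 1] = newest;
}
```

Drawing the array left to right each frame then makes the trace scroll while the newest point stays pinned at the right edge.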

Lab 4: I live but to servo

What a cool lab. As with all of the work in pcomp so far, I understand the code and I get how to apply it, but I’m still finding it frustrating that I would never in a million years be able to figure out how to write it myself. Say for instance Tom gave us the lab with no code–I’m pretty sure I wouldn’t be able to get the motor working. I feel a little bit like I’m just beginning to learn a new musical instrument or a new language: I’m playing other people’s pieces and repeating set phrases but am still miles from being able to compose something of my own. But at least I’m playing rather than learning scales and gerunds.

Which is good I guess, since theory always seems to catch me up. When I first attempted electronics in college with Paul Horowitz, I tried to follow the electrons around their circuitous routes and my brain short-circuited. I like ITP’s get your hands dirty first approach, though a little expectation management at the outset in terms of understanding would be reassuring–you’re going to feel lost, but don’t worry, it’s a good kind of lost, that kind of thing.

Setting up this lab was not particularly difficult.

I wired in a pot as per the instructions and then tried a bunch of other sensors, including a force sensor

and a photo sensor.

Each needed some adjusting of the maximum and minimum values being passed to the map() function. The photo sensor was by far the coolest, with a kind of action at a distance feel.
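For reference, Arduino’s map() is nothing fancier than linear interpolation from one range to another; a plain-C equivalent:

```c
/* Equivalent of Arduino's map(): rescale x from [in_min, in_max] to
   [out_min, out_max]. Like the real map(), it doesn't clamp, which is
   why constrain() so often follows it. */
long map_range(long x, long in_min, long in_max,
               long out_min, long out_max) {
    return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}
```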

The photo sensor got me thinking about my project for this week. I think I’m going to build some variation on hide-and-seek. When I was playing said game with my four-year-old cousin this weekend, I felt a pump of adrenaline every time she was about to find me. There’s a nice visceral component to the game, a primordial hunter-and-prey dynamic that might lend itself nicely to an interface. I wanted my project this week to explore whether it’s possible to mimic this kind of play and get a similar response from a device that hides from you, but I think the servo is too slow. It may make a good haunted house project… I am, however, still toying with the idea of an interface that tries to avoid the user: possibly a small box covered in a stretchy material (maybe nylon stocking?) with a plunger at one end that rises and stretches the material when a photo sensor is getting lots of light and falls back into the box when someone’s hand approaches. When Tom made a switch fade an LED as he held it down, I could really feel the switch (for some reason, I felt suction). I’m hoping that the interface I’m thinking of building will likewise have an associated feel.

Hide and seek also for some reason made me think of mushrooms, popping up at night and disappearing in the morning. Which made me think of sunflowers. If my hide and seek idea doesn’t work, I may build a sunflower that follows the light.

Ah the junk shelf! I found a perfect housing for my sheepish stretcher (working title). It’s some sort of computer part packaging.

I’ve made a cam with the spindle from a CD spindle.

I’ve hooked it all up to a servo that’s connected to a photo sensor. The initial result using the lab code is too bold; I need my little plunger to be more skittish. So I’ve adjusted as follows, reversing the minPulse and maxPulse to account for my motor position (NTS: test the servo before gluing it down). It works!

I also replaced the high and low bounds for the sensor in the map function with a variable that is initialized the first time the program runs to make sure that the sheepish stretcher works in all light conditions.

int servoPin = 2;      // Control pin for servo motor
int minPulse = 2500;   // "Minimum" servo position (reversed to suit my motor's mounting)
int maxPulse = 500;    // "Maximum" servo position
int pulse = 0;         // Amount to pulse the servo
int highValue = 0;     // Initialization value for unobstructed light
boolean intro = true;  // Switch variable for initialization

long lastPulse = 0;    // the time in milliseconds of the last pulse
int refreshTime = 20;  // the time needed in between pulses

int analogValue = 0;   // the value returned from the analog sensor
int analogPin = 0;     // the analog pin that the sensor's on

void setup() {
 pinMode(servoPin, OUTPUT);  // Set servo pin as an output pin
 pulse = minPulse;           // Set the motor position value to the minimum
}

void loop() {
 if (intro == true) {
   highValue = analogRead(analogPin);  // calibrate against current light conditions
   intro = false;
 }
 analogValue = analogRead(analogPin);      // read the analog input
 // convert the analog value and constrain it so that the servo doesn't roll over at high values
 pulse = constrain(map(analogValue, highValue - 60, highValue, minPulse, maxPulse), 500, 2500);

 // pulse the servo again if the refresh time (20 ms) has passed:
 if (millis() - lastPulse >= refreshTime) {
   digitalWrite(servoPin, HIGH);   // Turn the motor on
   delayMicroseconds(pulse);       // Length of the pulse sets the motor position
   digitalWrite(servoPin, LOW);    // Turn the motor off
   lastPulse = millis();           // save the time of the last pulse
 }
}

Then I tested the effect using the stretchy book cover I bought at Duane Reade:

I put processor and all the wiring inside the bottom of the casing:

The final result is pretty sweet.

Lab 3: It’s electric!

NTS: The soldering irons in the lab suck.

I borrowed Aaron’s “12V” DC power supply, in quotes because it actually outputs 17V!

The voltage regulator, however, does its job and outputs pretty close to 5V.

With two LEDs in series and no resistor, I measured a voltage drop of 2.47V across one LED and an almost identical drop of 2.52V across the other. I’m assuming they didn’t burn out because each acts as a resistor for the other.

It was too hard to use the multimeter while holding down a push button, so I replaced it with a switch that would stay on or off without my intervention.

A mistake when wiring in series.

Though initially my LEDs lit up erratically as shown above, I realized that I had wired both pins of the second into ground and corrected the problem. The three LEDs then didn’t light up because they split 5V three ways and apparently 1.7V will not power an LED of this kind.

I measured 4.97V across all three LEDs when wired in parallel, but my voltage regulator overheated (smelled not good) and suddenly the voltage dropped to 3V (measured across the voltage regulator as well as the LEDs). I let it cool off and it resumed normal operation.

I had to twist wires around my meter probes because they wouldn’t fit in the board—this also made taking pictures much easier!

The pot did not increase and decrease the voltage particularly smoothly. It jumped from 3V to the maximum (4.7V or so) right at the end. I tried another pot and it behaved the same way. I’m not sure why this is. It could be because people perceive volume and other quantities traditionally controlled by potentiometers logarithmically, so many pots are built with a logarithmic rather than linear taper. It could also have a purely physical explanation having to do with the material the pot’s resistive element is made of. In any case, I’m not sure I’ve figured out this particular puzzle, but electronics in general is seeming a whole lot less puzzling.

When left to their devices…

We ambled from school down Broadway and then through Soho and Chinatown (returning via Astor Place on the subway) at lunchtime on a Wednesday. We stopped for lunch at an outdoor taquería halfway through the exercise with the intention to write up our hastily scribbled notes, but it ended up being the richest part of our safari. People when left to their devices…

WHAT: Man using ATM
WHEN & WHERE: 12 pm: Chase bank on the corner of Houston and Broadway
APPARENT INTENT: To get money out, probably for lunch
TIME TAKEN: About a minute
REQUISITE MOTOR SKILLS: Hands for swiping the card and interacting with the touchscreen, eyes for reading the screen

WHAT: Two men using a Blackberry and an iPhone while waiting for a table at La Esquina
WHEN & WHERE: 12.14 pm: La Esquina, Kenmare St.
APPARENT INTENT: Checking email to while away the wait and avoid awkward conversation; they appeared to be colleagues rather than friends
TIME TAKEN: About a minute
# OF PEOPLE INVOLVED: 2, though each was using his device separately

WHAT: A man strumming an iPhone running a guitar emulation app for the delight of his friends, one of whom was taking a picture of the scene on his own iPhone and another who was simply holding his
WHEN & WHERE: 12.29 pm: La Esquina, Kenmare St.
TIME TAKEN: 20 seconds
# OF PEOPLE INVOLVED: 1 strummer, 1 photographer, 1 appreciative observer
REQUISITE MOTOR SKILLS: Hands for strumming and photographing and grasping, sight for operating the camera and appreciating the sight gag, ears for hearing the sound of the shutter and the virtual guitar

WHAT: Waiter using an outdoor cash register
WHEN & WHERE: 12.44 pm: La Esquina, Kenmare St.
APPARENT INTENT: To enter a sale and get somebody’s check
TIME TAKEN: Several seconds-long bursts of action
# OF PEOPLE INVOLVED: 1 operator, around 20 or 30 people providing “data.”
REQUISITE MOTOR SKILLS: Hands for punching in numbers and making change, eyes for reading the display

WHAT: Two guys in helmets and harnesses getting lowered from a billboard in a wobbly cherry picker
WHEN & WHERE: 12.47 pm: Corner of Centre Market Place and Broome
APPARENT INTENT: To get down from the billboard where they’d just replaced an ad without dying
TIME TAKEN: Five minutes from when we arrived
# OF PEOPLE INVOLVED: 1 operator manning the controls from the ground, 2 guys dangling precariously at his mercy
REQUISITE MOTOR SKILLS: Manual dexterity for controlling the cherry picker, spatial perception, feet, balance, hearing to communicate from the ground to the billboard

WHAT: Courier using a fancy intercom system and calling on his phone from in front of an office building
WHEN & WHERE: 12.51 pm: Grand and Lafayette
APPARENT INTENT: To enter the building to make a delivery (when the doorbell didn’t work, he tried calling; eventually he went in the door when somebody came out)
TIME TAKEN: 50 seconds
REQUISITE MOTOR SKILLS: Fingers for punching the button, eyes for identifying the right one, speaking and hearing to communicate across the intercom (both users)

WHAT: A man taking a digital photo of his girlfriend standing beneath a street sign
WHEN & WHERE: 12.53 pm: Canal and Lafayette
APPARENT INTENT: To record a memory and proof of visiting a location
TIME TAKEN: 10 seconds
REQUISITE MOTOR SKILLS: Hands to operate the camera and eyes to frame the picture

WHAT: Man with luggage struggling with the card reader on a subway turnstile
WHEN & WHERE: 1.04 pm: Canal Street subway
APPARENT INTENT: To get through the turnstile and catch a train
TIME TAKEN: Around a minute
# OF PEOPLE INVOLVED: 1 initially until he was joined by a cop who helped him
REQUISITE MOTOR SKILLS: Hands for sliding card, ears for hearing card reader feedback, sight for looking at the display, voice for swearing

WHAT: A man playing an electric guitar and singing House of the Rising Sun
WHEN & WHERE: 1.15 pm: Astor Place subway
APPARENT INTENT: To entertain, to make some spare change, to enjoy himself, to make himself heard over loud trains
TIME TAKEN: We watched for 4 minutes
# OF PEOPLE INVOLVED: 1 plus an active audience of 3 and a passive audience of around 20
REQUISITE MOTOR SKILLS: Hands for playing and adjusting knobs, voice for singing, hearing

WHAT: Man refilling Metrocard
WHEN & WHERE: 1.19 pm: Astor Place subway
APPARENT INTENT: To add money to his Metrocard
TIME TAKEN: 20 seconds
REQUISITE MOTOR SKILLS: Hands and eyes for interacting with the touchscreen

WHAT: Alex setting off anti-theft alarm at Walgreen’s
WHEN & WHERE: 1.23 pm: Astor Place Walgreen’s
APPARENT INTENT: None (though taking something without paying was initially suspected)
TIME TAKEN: 1 minute
# OF PEOPLE INVOLVED: 1 plus the cashier that waved him through
REQUISITE MOTOR SKILLS: Ears for hearing the alarm

WHAT: Escalade pumping really loud hip-hop
WHEN & WHERE: 12.40 pm: Broadway right outside of Tisch
APPARENT INTENT: To draw attention, to feel skull vibrate, to share musical taste with others
TIME TAKEN: A minute or so before the light changed and he zoomed off
# OF PEOPLE INVOLVED: 1 driver, 2 passengers, and a block worth of pedestrians
REQUISITE MOTOR SKILLS: Hands for tuning the radio and turning up the volume, sight to see the display, hearing

In addition, there are a couple of technological interactions we quickly gave up recording on a case by case basis as they were ubiquitous.

The first was anything to do with talking on the phone. A good thirty percent of the people we passed on the street were either talking on their phones or looking at them intently or just holding them. Their intentions varied from killing time to checking the exact location of a meeting to catching up with a friend or colleague to complaining to just having something in their hands. The most interesting thing we noted was that people wearing watches routinely checked the time on their phones rather than on their wrists.

Phones were plentiful but digital music players were even more so. It seemed that anyone who was walking alone was wearing earbuds and bopping alone to a private beat. The primary intention in this case was to replace the harsh sounds of the city with a mellifluous soundtrack of choice, though we suspected an ulterior intention to signal that the wearer was not to be disturbed.

At every intersection we crossed, we and everyone else interacted with traffic lights. The traffic lights were interpreted by most pedestrians as a suggestion for caution when crossing rather than a steadfast directive. The motorists fortunately favored the latter interpretation. Interactions varied from quick glances to long hateful stares, and in one case, to a disproportionately hateful stream of invective.
