Archive for December, 2008

The 2008 Winter Show

2008 Winter Show Poster

The 2008 ITP Winter Show was pretty incredible. The official count puts the number of visitors at over 2300, and I’d say that a majority of them came away surprised, even if they’d been to an ITP show before. There were helpless robots, hungry robots, drawing robots, a psychedelic bowl, a screen that fogs up when you breathe on it, devices and games to outwit the paparazzi, a prenatal Twitter interface, a spinning fluorescent sculpture, cootie-catcher pixels, remote-sensing pajamas, birds that tweet when you Twitter, a video mosaic, and lots, lots more.

T.H.A.W.

UPDATE: NOW WITH SOUND HERE


T.H.A.W. Flyer 1

T.H.A.W. was my PComp final. It’s a piece that explores technological determinism, decay, and systems that don’t give you feedback until it’s already too late (think the economy or global warming or girlfriends). It also offers a simpler pleasure: the destructive joy of aiming a hairdryer at ice.

Five sound-generating acrylic ice cubes are mounted on a piece of glossy black aluminum. Embedded in each is an RGB LED and a temperature sensor. The temperature determines both the color of the LED and the sound the ice cube produces. Initially, all of the ice cubes are glowing blue and producing soothing natural sounds: crickets on a summer night, a burbling stream, an echoey rainforest beat. When the hairdryer is directed at a cube, it begins to absorb heat and turn red, with a grating industrial/mechanical sound gradually replacing its initial natural sound. But the user can’t hear the change because the drone of the hairdryer drowns it out. S/he’s having too much fun melting all the ice cubes, trying to get them all to stay red as they begin to cool and return to blue. The user turns off the hairdryer and is shocked to hear sirens, traffic, trains, and the rat-a-tat-tat of a nail gun in a factory. Maybe playing whack-a-mole with a hairdryer wasn’t such a good idea after all. But it’s too late now. All s/he can do is wait for the system to gradually cool and return to its earlier state, though it never looks or sounds quite the same again.

It also has a built-in physical toggle that does a conceptual high-low switcheroo, always my favorite. Flick the switch, and all the conceptual headiness gives way to five randomly colored ice cubes that, if heated in the right order, play the opening chords of Van Halen’s Jump. Hair metal with a hairdryer. Now that’s a concept I know we can all get behind.

[I’ll be uploading a better video of the finished product that includes the audio once I get a chance to document it. In the meantime, here are some pictures I took of it and a video from Phil Torrone over at MAKE that shows it working, minus the audio:]

All Blue

Two Red Cubes

PComp Final: T.H.A.W. née Icy Hot

T.H.A.W.

The path that led to my PComp final (and Winter Show submission) was as circuitous as it was fortuitous, an exercise in a kind of free association that I rarely get to follow through on.


HOW I GOT THERE

Since wind featured prominently in my work this semester, I started out thinking I would extend the theme by rejigging my ICM midterm to work with a physical fan. Turn on the fan, aim it at an image on a screen, and the pixels are blown around in the direction you’re holding the fan and with a force proportional to how close the fan is to the screen. But when I was discussing the idea with Bridge, she said something along the lines of, “So when the person holds the hairdryer against the screen…”—hairdryer? Whoa.

I haven’t had any significant hair since before puberty, so hairdryers are outside the realm of my ordinary experience. I was envisioning mounting an infrared LED on the end of one of the blades of a little AA-powered handheld fan, the kind that comes in summer camp care packages, and using a camera to track it, determining the distance and the angle by the relative size and shape of the ellipse produced by the spinning light. A simple but elegant solution.

But a hairdryer! A hairdryer is a gun of feminist theory, it’s a pleasant pedal tone that harmonizes smoothly with even the most unpleasant of singing voices while erasing the sour notes, and it’s a sound-barrier that insulates the user from the encroachments of doorbells, dinner calls, and telephones. It boasts two dimensions a fan doesn’t have—noise and temperature. It wouldn’t do to ignore these by using a hairdryer just to blow pixels around a screen.

But how to take advantage of them? I had lots of ideas. Melting something onscreen (ice cubes perhaps?), blow drying virtual hair, a game in which opponents have to move a virtual feather along a screen using hairdryers, a hairdressers’ duel at dawn—most of these felt more like programming projects than explorations of physical computing. Ultimately, I was drawn to the hairdryer because it’s fun to hold, it’s noisy, and it’s very responsive. A good project would necessarily explore and combine each of these aspects.

I’d also been itching to play with MIDI since learning during midterms that it was nothing but a series of bytes sent at a specific baud rate. How about a hairdryer-powered musical interface? That seemed ripe with possibility.

Add a couple of in-class discussions, a really productive crit group show and tell, and several hallway conversations with my classmates, and so T.H.A.W. was born.


MIDI

KORG NX5R

MIDI is really fun and surprisingly easy to implement. Basically, it’s a status byte followed by one or two data bytes that set parameters. For instance, if you want a MIDI device, say a synthesizer like the Korg NX5R I used, to play a note, you send it a status byte that tells it what channel to play the note on, a data byte that specifies which note, and another that determines the volume. Depending on the device, there are also control messages that allow you to choose a sound bank, change instruments, vary the pitch, add effects, and even control esoteric parameters such as attack and decay time.
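
To make the message structure concrete, here’s a minimal sketch of how those bytes get assembled. It’s in Python rather than Arduino C, purely for illustration, and the helper names are my own, not part of any MIDI library:

```python
def note_on(channel, note, velocity):
    """Build a three-byte MIDI Note On message.

    The status byte is 0x90 plus the channel number (0-15); the two
    data bytes carry the note number and the velocity (both 0-127).
    """
    return bytes([0x90 | channel, note, velocity])


def control_change(channel, controller, value):
    """Build a three-byte Control Change message (status 0xB0)."""
    return bytes([0xB0 | channel, controller, value])


def program_change(channel, program):
    """Program Change (status 0xC0) takes just one data byte."""
    return bytes([0xC0 | channel, program])


# Middle C (note 60) at volume 80 on channel 0:
msg = note_on(0, 60, 80)
print([hex(b) for b in msg])  # ['0x90', '0x3c', '0x50']
```

Shove those three bytes out a serial port at MIDI’s 31,250 baud and the synth plays the note; that’s all the Arduino code further down is doing.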

To get started with MIDI, I read up on the specification here (the tables at the bottom of the page were especially helpful) and then built the hardware interface for the Arduino based on the instructions here and here.

I ran into one problem, which I skirted more than addressed. If you turn a note on and off every loop, it doesn’t produce a smooth sound. So I reprogrammed the sounds I was using on the synth not to decay and turned them on only once, the first time through the loop. Then I sent control changes linked to the sensor values to vary the volume. Obviously, this is only possible if you’re using sounds with infinite sustain that don’t vary too much over time. The code looks something like this:

note1 = map(thermValue0, 0, 4094, 0, 127);

if (note1 > threshold) {
  if (startup1) {
    noteOn(0xB0, 0x78, 0x00);  // all sound off, channel 0
    noteOn(0xB0, 0x00, 0x51);  // bank select: programA
    noteMod(0xC0, 19);         // select sound 20
    noteOn(0x90, 60, 80);      // turn on middle C at volume 80 on channel 0
    noteOn(0xB1, 0x78, 0x00);  // all sound off, channel 1
    noteOn(0xB1, 0x00, 0x00);  // bank select: gma
    noteMod(0xC1, 125);        // choose program 126
    noteOn(0x91, 70, 80);      // play the Bb above middle C at volume 80 on channel 1
    startup1 = false;
  }
  noteOn(0xB0, 0x07, constrain(110 - note1, 0, 127));  // change the volume of the note on channel 0
  noteOn(0xB1, 0x07, constrain(note1 - 20, 0, 127));   // change the volume of the note on channel 1
}

Here was one of my early experiments, controlling the modulation of a note using a potentiometer:



BUILDING IT

Actually constructing T.H.A.W. was the most fun part of the entire process. Once I got the MIDI circuit working, all that was left to do was replicate it five times to create five different input-and-sound pairs, set up an LED driver to run five RGB LEDs (which have three cathodes and one anode each and thus require more PWM outputs than the Arduino has built in), and decide what the whole thing should look like.

The matter of T.H.A.W.’s appearance was resolved serendipitously. I was debating taking the subway to school on a nasty, rainy Wednesday but decided to walk along Houston and get wet. At the corner of Mott, I saw a piece of enameled black metal—maybe an old shelf—atop a pile of garbage. I went over to inspect and discovered it was a piece of aluminum that would be perfect as the base for my project.

T.H.A.W. Cube

LED and Thermistor

RedFlash2

At this point, I wasn’t quite sure what I’d be aiming the hairdryer at; I just knew it would have an LED embedded in it and that it would look great glowing on this shiny black enamel surface. A couple of days later, I went down to Canal Plastics and found these little acrylic ice cubes, which screamed, “Stick an LED in me and melt me with a hairdryer!” That is exactly what I did.

For the LED driver, I used a Texas Instruments TLC5940, which has an Arduino library written for it. I spent four hours pulling out what little hair I have trying to get it to work before I realized two things:

  1. RGB LEDs can be either common cathode or common anode. Pay attention! In the first case, the common pin goes to ground and you need to supply each of the LED’s red, green, and blue pins with its own PWM’ed power. In the second, the common pin goes to power and each color pin needs its own PWM’ed path to ground, which is exactly what the TLC5940, a constant-current sink, provides.
  2. Any pinMode declarations of Arduino input pins not associated with the TLC5940 need to come before the Tlc.init() call.

And that’s it. I played with the hairdryer and the thermistors to work out good timing for the heating and cooling. To keep users from heating the thermistors to the point where they would take several minutes to cool down, I max out the LEDs’ red value early: after only five or six seconds of hairdrying, a cube’s color jumps from a kind of pink directly to bright red, with a flicker intended to suggest to the user that it would be a good time to move on to another cube.
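
The color behavior can be sketched roughly as follows. This is Python rather than the actual Arduino code, the thermistor thresholds are invented for illustration, and the flicker is omitted:

```python
def cube_color(therm, cold=200, hot=600):
    """Map a raw thermistor reading to an (r, g, b) tuple.

    Below `cold` the cube is fully blue; at or above `hot` it snaps to
    full red (the early max-out described above); in between, red fades
    in as blue fades out. The thresholds here are illustrative, not the
    values used in T.H.A.W.
    """
    if therm <= cold:
        return (0, 0, 255)
    if therm >= hot:
        return (255, 0, 0)  # jump to bright red early
    t = (therm - cold) / (hot - cold)
    return (int(255 * t), 0, int(255 * (1 - t)))


print(cube_color(100), cube_color(400), cube_color(700))
```

In the actual piece the same reading also drives the crossfade between the natural and industrial sounds.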

Here’s the wiring:

T.H.A.W. Arduino

And the construction:

T.H.A.W. Underside

And the final result, sans sound:



Shoots and Leaves

The online version of Shoots and Leaves is here. It uses the mouse and keyboard instead of a Wiimote.

Branches

For my ICM final, I teamed up with the lovely and talented Michelle Mayer, and together we set out to create something beautiful out of something repulsive. My initial idea was to cause red flowers to explode in a messy splatter all over a screen if you aimed a gun at your own head while standing in front of said screen. That seemed derivative (that t-shirt of Itamar’s with the birds flying out of the guy’s head) and more of a PComp problem, so we restated the challenge as creating life out of death.

The idea became to create an algorithmic seed that, once planted, would grow on its own in an unpredictable and unique manner. We planned to project onto a mannequin wearing a white shirt and shoot at it with a Wiimote. But instead of drawing blood, our shots would draw flowering vines. Not to get too meta-geeky here, but the idea and its visualization are more than a little reminiscent of Project Genesis in Star Trek II: The Wrath of Khan.

When creating the effects for that scene, Industrial Light and Magic had to invent a bunch of graphics technologies. And good old retinal scanning: oh brave new world that has such technologies in it!

Anyway, we wanted to do something similar. Our first thought was to use input from a Wiimote to trigger a series of video elements we would have created ahead of time, but Processing’s limitations when dealing with large numbers of simultaneous video clips, the prospect of spending even more time in After Effects, and Dan Shiffman’s encouragement to seek out a programmatic solution convinced us that we might as well program life. Our initial proposal is here.

Creating Branches

The first task we tackled was branching. If our flowering vines were going to be at all realistic, they would have to spawn other branches unpredictably as they grew. This was a matter of setting a series of necessary preconditions for branching and a probability of its occurrence once those preconditions were met. Then all we had to do was pass a point on an existing branch as the origin for a new branch instance. That actually wasn’t so difficult, though initially all the existing branches would branch simultaneously and with ever-increasing frequency until the whole screen filled up exponentially. Adding a variable in each branch to keep track of its lifetime and randomly changing it upon a successful branch solved that.

Curved Paths

The next challenge was getting the branches to move in nice curvilinear paths that were nonetheless irregular. We spent an entire day playing with sine functions, but to no avail; our vines looped like drunken rollercoasters. The solution occurred to me right as I was going to bed one night. Taking my cue from Craig Kapp’s brilliant gravity simulator, I thought, why not have a bunch of balls bouncing around invisibly in the background, exerting a force on the growing branches? Have three, say, and when a branch is born, assign it to follow one at random. Make the force fall off with the square of the distance and it should yield random-looking curved paths. And it did!
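
As a rough sketch of the technique, here’s a single invisible attractor pulling a branch tip, with the pull falling off with the square of the distance. It’s in Python, it assumes my reading of the gravity-simulator idea, and every constant is invented:

```python
import math


def step(pos, vel, attractor, strength=500.0, dt=1.0):
    """Advance a branch tip one frame, pulled toward an invisible ball.

    The pull is aimed at the attractor and falls off with the square of
    the distance, so a far-away tip curves gently while a nearby one
    bends hard.
    """
    dx, dy = attractor[0] - pos[0], attractor[1] - pos[1]
    d2 = max(dx * dx + dy * dy, 1.0)  # clamp to avoid a blow-up at zero distance
    d = math.sqrt(d2)
    ax, ay = strength * dx / (d * d2), strength * dy / (d * d2)  # unit vector / d^2
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel


# Trace a short path that curves toward an attractor at (100, 0):
pos, vel = (0.0, 0.0), (1.0, 1.0)
for _ in range(20):
    pos, vel = step(pos, vel, (100.0, 0.0))
```

With three such balls and each new branch assigned to one at random, every vine gets its own gently curving trajectory.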

Aging: A Perennial Problem

Next we decided we wanted our branches to thicken with age, as they would in real life. This we accomplished by storing each branch’s last fifty x and y positions and then drawing fifty successively larger semi-transparent ellipses at each. This creates the illusion of a thickening that follows the branch’s sprout. It also slows things down a fair amount because of the memory it requires.
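
In outline, the position history works something like this. It’s a Python sketch; only the buffer length of fifty comes from our actual implementation, and the rest is illustrative:

```python
from collections import deque


class BranchTrail:
    """Keep the last fifty positions of a branch tip and the radii to draw."""

    def __init__(self, max_len=50):
        self.history = deque(maxlen=max_len)  # old positions fall off the end

    def record(self, x, y):
        self.history.append((x, y))

    def ellipses(self, base_radius=1.0, growth=0.1):
        """Yield (x, y, radius) triples for drawing.

        Older stored points get larger radii, which creates the illusion
        of thickening that trails behind the growing tip.
        """
        n = len(self.history)
        for i, (x, y) in enumerate(self.history):
            # i == 0 is the oldest point, so it gets the biggest ellipse
            yield (x, y, base_radius + growth * (n - 1 - i))


trail = BranchTrail()
for t in range(60):
    trail.record(t, t * 2)
print(len(trail.history))  # 50: only the most recent positions survive
```

The cost is exactly what we hit in practice: fifty semi-transparent ellipses per branch per frame adds up fast.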

Switch Cases and Flowers

Our final design step involved implementing flowers and leaves (and little twirly tendrils, which we never quite figured out in our multi-day programming orgy). Since the basic conditions for branching are no different from the conditions for sprouting leaves or flowering, we implemented a switch case that favored leaves:

          if (b.check()) {                 // if a branch hasn't just branched
            if (b.branchcount < 20) {      // and it hasn't already branched more than 20 times
              int chance = round(random(0, 3));  // pick one of the following cases at random
              switch (chance) {
              case 0:
                b.branch();                // spawn a new branch
                b.branchcount++;
                b.lifetime = -b.branchcount * 200;  // delay the next sprout from this branch
                break;
              case 1:
                b.flower();
                b.branchcount++;
                b.lifetime = -b.branchcount * 200;
                break;
              case 2:                      // two leaf cases to ensure more leaves than flowers and branches
                b.leaf();
                b.branchcount++;
                break;
              case 3:
                b.leaf();
                b.branchcount++;
                break;
              }
            }
            else {
              branches.remove(b);
            }
          }

We addressed several interesting smaller problems (rotating the leaves using the arctangent function to ensure that they grew according to the direction in which the branch was moving, implementing Wiimote control, and the eventually discarded use of real images of leaves and flowers instead of programmatically drawn ones) in Encroachment, which is documented here.

Here's Michelle showing the project on a wall at ITP:

The Final Straw


For my last Commlab assignment, I abandoned Happy and After Effects and returned to stop motion, which was my favorite technique among all the ones we played with this semester. There is something so satisfying about taking a format with which I’m very comfortable (nice static Photoshopable photographs) and transmogrifying it, by virtue of nothing other than repetition, into animation, a format that until recently provoked cold sweats.

Bralex

I also wanted to revisit this photograph, which was part of an invitation I made for the party Bridge and I threw to announce our arrival in New York (that sounds so grand, but the fact that we threw a party doesn’t mean anyone actually caught it).

Bridge and I discussed the idea and she thought it would be fun to film a discombobulated argument.  I had recently listened to the White Stripes song “There’s No Home for You Here” which I thought would make a great soundtrack.

So we set up the tripod and a bunch of lights in the apartment and separately photographed our eyes and our mouths as the song played.  I stitched the resulting photos into a four-frame collage in Final Cut.  I’m not all that happy with it.  It’s too slow; I should have taken about three times as many photos, though I’m not entirely convinced that the whole thing shouldn’t just be done in video to begin with. And syncing the sound was a total nightmare.

Also, and this was the feedback I got in class, why is there no interaction between the frames?  It seems a shame to set up these boundaries around each frame only to respect them!  It’s a decent proof of concept but it needs redoing, and that’s what’s great about being on an academic calendar.  In January, we’ll plan some interframe action and shoot it again, this time in DV.

Encroachment: The Buggiest Software on Earth!

Encroachment

Encroachment was a study for my ICM final that actually turned out to be pretty cool on its own. The online version, which uses the mouse and keyboard instead of a Wiimote, is here. Play with it!

I hate cockroaches. But they do lend themselves to creepy, jerky motion, which is exactly what I needed for this particular experiment. I wanted to do two things in a simple sketch form before porting them over to Michelle’s and my flowering vines:

  1. Orient the cockroaches correctly along their direction of motion knowing only their current x and y positions and those one loop previous.
  2. Get the Wiimote working reliably.

Orienting the roaches was accomplished through trial and error, and there still seems to be some directional ambiguity: when certain roaches move at certain angles, slight variations yield arctangents 180 degrees apart, causing the roach to flash and turn manically. Because the effect conveys a kind of skittishness that I associate with roaches and adds to the program’s overall creepiness, I didn’t try to correct it.

This is the code:

void display() {
  float slope = (y1 - y) / (x1 - x);
  float theta = atan(slope);
  pushMatrix();
  translate(x + 40, y + 50);
  scale(size);
  if (vx < 0 && vy > 0) rotate(theta - PI/4);
  if (vx > 0 && vy > 0) rotate(theta + PI/4);
  if (vx > 0 && vy < 0) rotate(theta + PI/4);
  if (vx < 0 && vy < 0) rotate(theta - PI/4);
  image(lilroach, -150, -197);
  popMatrix();
  x = x1;
  y = y1;
}
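
Incidentally, the ambiguity comes from atan itself: a slope can’t distinguish a direction from its opposite, which is why the code needs the quadrant tests on vx and vy. Processing’s atan2 takes the two components separately and returns an unambiguous angle. A quick demonstration, in Python, whose math.atan2 behaves the same way:

```python
import math

# atan sees only the slope, so opposite directions collapse together:
assert math.atan(1 / 1) == math.atan(-1 / -1)  # both pi/4

# atan2 keeps the signs of the two components, so it tells them apart:
a = math.atan2(1, 1)    # pi/4
b = math.atan2(-1, -1)  # -3*pi/4
print(a, b)
```

Of course, fixing it would also have fixed the skittish flashing I decided I liked.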

 
The Wiimote

Getting the Wiimote working with Processing is not terribly difficult. It does, however, require a number of downloads and tweaks. First, you need to install the oscP5 library for Processing, which allows it, among other things, to access serial information over Bluetooth from the Wiimote, which, as luck would have it, is a Bluetooth device. It’s available here. Then you need to install the interface that allows the Mac and the Wiimote to speak over Bluetooth. I used darwiinosc by Andreas Schlegel, which can be downloaded here.

Once both of those are installed, you simply run darwiinosc, connect the Wiimote by holding down buttons 1 and 2, and you should start to see the accelerometer readings graphed in the console. To get Processing to recognize the Wii, you need to import the oscP5 library and set up the Wiimote objects you’ll be using in your sketch (buttons, tilt, IR, acceleration, etc.). There is clear and exhaustive example code included with the library.

Two things that did take me a little while to figure out were the syntax for getting the buttons to work and the IR tracking. The first two lines of the following code were confusing the hell out of me until I realized that this function contains both the onPress and onRelease actions and that the eponymous boolean variable tells the function which to execute. This is how I got the Wiimote to vibrate only when you pressed the trigger:

void buttonA(int theValue) {
  buttonA = (theValue == 1);  // true on press, false on release
  if (buttonA) {
    roaches.add(new Roach(trueX - 100, trueY - 137));
    forcefeedback(true);
  }
  else {
    forcefeedback(false);
  }
}

The Wiimote contains an infrared camera that can track up to four separate infrared LEDs. I’m lazy and only used one, but that did give me a kind of lopsided motion that I had to account for with hard-coded and totally inelegant adjustments. The IR function stores twelve variables in an array: the x and y positions and relative size of each of four possible LEDs. I just used the x and y position of one LED in a battery-powered sensor bar that I think came from GameStop, as distance from the screen was not a concern.
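
Unpacking that flat array looks something like this. It’s a Python sketch, and the exact ordering of the twelve values (x, y, size repeated per LED) is my assumption based on the description above:

```python
def unpack_ir(values):
    """Split the flat 12-element IR array into four (x, y, size) tuples.

    The Wiimote reports up to four tracked points; the layout assumed
    here (x, y, size repeated for each point) is illustrative.
    """
    assert len(values) == 12
    return [tuple(values[i:i + 3]) for i in range(0, 12, 3)]


points = unpack_ir([0.5, 0.5, 2, 0, 0, 0, 0, 0, 0, 0, 0, 0])
x, y, _ = points[0]  # only the first LED is used, as in Encroachment
```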

Wii Gun

After that was working, all I had to do was find a gun attachment for the Wii that felt enough like a real gun to conjure the visceral emotional connotations we needed for the final project (the realism of the gun, for the purpose of creating and killing roaches, seems beside the point). This one, which cost about $15, has an ingenious little piece of plastic that slides along the top of the remote and depresses the A button when you pull the upper trigger, and a lever on the underside that presses the B button when you pull the lower trigger. Perfect!

And here it is projected up on the wall!

After Shocks, or the Misadventures of Happy International

Happy International

After Effects. Wow. I had no idea motion graphics were this much fun, nor did I suspect they’d be this time-consuming. But that’s also possibly because I went about this all wrong. Instead of first going to the library to check out books filled with images that took hours and hours to Photoshop into animatability, I should have worked visually on the story I wanted to tell.

It was frustrating to show my sketch in class and have Marianne say, “Ok, great, but move the camera around, give us multiple shots, you have infinite control, use it.”  We’ve read extensively on frames and points of view, we’ve storyboarded and shot multiple movies, and when finally we’re given total freedom, liberated from the constraints of physical cameras and perspectives, I immediately revert to overhead projector mode.


In any case, the story I was hoping to tell was that of Happy International, bon vivant and rake extraordinaire, who cruises the world in his speedboat picking up female dancers of all ethnicities and taking them on fabulous motorboat cruises of exotic locales with canals/navigable rivers/waterfronts.  I ended up with a paean to the puppet pin tool (really, I can’t gush enough) and a lot of clips that never came together.  I will revisit this particular assignment.

Here’s the painstakingly motion-tracked title sequence, lifted from an old educational video on archive.org:

And here are Happy and his girls:

Exciting Laos!