Archive for November, 2008

Nice Pianist!


After three days of sweating in a tux, lugging heavy lights up forgotten back staircases of the Tisch building, getting coated in dust, getting punched repeatedly in the hand and abdomen, making quite a fuss to get access to a grand piano, and then spending the better part of a week hunched over Final Cut in the video lab, here it is: our small visual sonata.

I was worried that the story wasn’t going to look very good, but Michelle’s color correcting wizardry resulted in exactly the look we wanted. It’s amazing to see a story I dreamed up years ago semi-ported to the screen!

My favorite part of the whole process was watching Michelle lug the camera and tripod all the way up twelve flights of stairs and then balance it precariously over the void between two opposing railings only to not be able to yell loud enough for Zach and me to hear her below (a trombone being played on some intervening floor, possibly the seventh, drowned her out completely).

Once again I’m amazed by the amount of depth and believability sound adds to a scene. Shots that just didn’t cut well together flowed naturally once they shared a soundtrack. Three people and a curtain became a full party with the addition of appropriate ambient sounds. And a silly stomp became cringeworthy with the addition of a little crunch.

I can’t imagine the sheer volume of work and organization that goes into a big studio movie, especially one with complicated live-action special effects. And honestly, I’m happy I don’t have to.

Organic Automata, Or Contemplating Countless Hours of Coding

For our ICM final, Michelle and I want to make organic creepers using Processing. I’ve tried to explain the idea to a number of people, but this video does a better job. Pay attention only to the black branches:


Metamorphosis from Glenn Marshall on Vimeo.

Philips also approximated the effect we were looking for in their electronic tattoos:

So what we want to do is:

IDEALLY: Create a program that will generate branching tendrils that creep over a surface like carnivorous vines in a sci-fi jungle movie, blossoming and branching organically (possibly randomly and possibly in response to some sort of environmental or programmatic stimulus). We then want to use a Wiimote or an infrared-loaded gun to shoot “seeds” that will start this creeping and branching on a wall we project onto (or even onto a mannequin, so that the user experiences the full intended effect of producing life and beauty with a terribly deadly action). Think Genesis Device in everybody’s favorite Star Trek movie, The Wrath of Khan (a rough Processing sketch of this branching idea follows this list);

OR POSSIBLY: Create still, high-contrast vector images of the sorts of shapes we want and then mask them. Using Processing, we detect the edges of the various shapes and birth little alpha-channel-nibbling automata that follow them, turning our mask translucent as they trace the tendrils and their various branches, and rely on semi-random camera movement to mimic true organic generation;

OR MAYBE EVEN: Create many individual high-contrast creeping and branching tendrils as movies in After Effects and then remove their backgrounds and combine them dynamically in Processing (this option seems like it might not work for memory reasons once the number of simultaneous videos exceeds about four).
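
To make the IDEALLY option a little more concrete, here’s a back-of-the-envelope Processing sketch of the branching idea: each growing tip wanders forward, leaves a trail, and occasionally splits, and a mouse click stands in for the Wiimote/IR-gun “seed.” Every number in it (step size, wobble, branch odds) is a placeholder we’d have to tune, not anything we’ve settled on.

    // A bare-bones creeper: each tip wanders forward, leaving a trail,
    // and occasionally splits into a new branch. Clicking plants a "seed"
    // (standing in for the Wiimote / infrared-gun shot).
    ArrayList tips = new ArrayList();

    void setup() {
      size(640, 480);
      background(255);
      smooth();
      stroke(0);
    }

    void draw() {
      for (int i = tips.size() - 1; i >= 0; i--) {
        Tip t = (Tip) tips.get(i);
        t.grow();
        if (!t.alive) tips.remove(i);
      }
    }

    void mousePressed() {
      // "shoot" a seed: start a few tendrils at the click point
      for (int i = 0; i < 3; i++) {
        tips.add(new Tip(mouseX, mouseY, random(TWO_PI)));
      }
    }

    class Tip {
      float x, y, angle;
      boolean alive = true;

      Tip(float x_, float y_, float a_) {
        x = x_; y = y_; angle = a_;
      }

      void grow() {
        float nx = x + cos(angle) * 2;
        float ny = y + sin(angle) * 2;
        line(x, y, nx, ny);
        x = nx; y = ny;
        angle += random(-0.3, 0.3);          // organic wobble
        if (random(1) < 0.02) {              // occasional branch
          tips.add(new Tip(x, y, angle + random(-1, 1)));
        }
        if (random(1) < 0.005 || x < 0 || x > width || y < 0 || y > height) {
          alive = false;                     // tendril dies or leaves the canvas
        }
      }
    }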

ICM: Gone With the Wind


After my initial experiments with virtual fans in 3D, I decided to scrap the hemispherical fan movement in favor of planar movement. I hooked up a potentiometer to an analog input on an Arduino and had it set the fan speed in the Processing sketch over serial. Eventually, I’d like to use an actual fan rather than a pot and an onscreen fan to blow the pixels around; I’m not sure exactly how that would work, but it might make a good PComp final.
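
The serial handoff itself is simple: the Arduino just reads the pot with analogRead and sends the value out as a single byte, and the Processing end does something roughly like this (the port index and baud rate here are guesses; use whatever your machine reports):

    import processing.serial.*;

    Serial port;
    float fanSpeed = 0;   // 0..1, scales the wind force on the pixels

    void setup() {
      size(400, 400);
      // pick your Arduino's port; Serial.list() prints what's available
      port = new Serial(this, Serial.list()[0], 9600);
    }

    void draw() {
      while (port.available() > 0) {
        int val = port.read();      // Arduino sends one byte, 0-255
        fanSpeed = val / 255.0;
      }
      background(0);
      // ...blow the pixels around in proportion to fanSpeed...
    }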

Initially, I had trouble keeping track of all the pixels; every so often, one would get lost. I made some extra hard-coded adjustments and now the pixels return to their original positions. In any case, here is a mouse-activated version of the sketch I showed in class.
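
Stripped down to the bookkeeping, the fix amounts to every pixel remembering its home position and easing back toward it when nothing is pushing on it. Here’s a toy version of that idea, with the mouse standing in for the fan and all the constants pulled out of thin air:

    // Every point keeps its home position so it can always find its way back,
    // no matter how far it gets blown around.
    int cols = 40, rows = 30;
    float[] homeX = new float[cols * rows];
    float[] homeY = new float[cols * rows];
    float[] posX  = new float[cols * rows];
    float[] posY  = new float[cols * rows];

    void setup() {
      size(400, 300);
      stroke(0);
      for (int i = 0; i < cols; i++) {
        for (int j = 0; j < rows; j++) {
          int n = j * cols + i;
          homeX[n] = i * 10 + 5;
          homeY[n] = j * 10 + 5;
          posX[n] = homeX[n];   // start at home
          posY[n] = homeY[n];
        }
      }
    }

    void draw() {
      background(255);
      for (int n = 0; n < posX.length; n++) {
        // "wind": push points away from the mouse when it gets close
        float d = dist(mouseX, mouseY, posX[n], posY[n]);
        if (d < 50) {
          posX[n] += (posX[n] - mouseX) * 0.1;
          posY[n] += (posY[n] - mouseY) * 0.1;
        }
        // ease back toward the remembered home position
        posX[n] += (homeX[n] - posX[n]) * 0.05;
        posY[n] += (homeY[n] - posY[n]) * 0.05;
        point(posX[n], posY[n]);
      }
    }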

28 Minutes Later

After a Friday daytrip to Storm King and multiple marathon eight-hour discussions, all eight of us in my Applications group decided that the best way to react to Lili Cheng’s underwhelming twenty-minute talk about how great her job at Microsoft is would be to get our entire class to create something lasting together while having a really good time.

So much of creativity depends on a properly fertile environment. The eight of us created this environment by breaking up our 100 classmates into groups of twelve, running silly energizing exercises to get the lethargic afternoon blood pulsing through their veins, and giving them very specific structural instructions while leaving all content decisions up to them.

Four groups were tasked with creating a 1-minute scene from a zombie film:

  1. 2 ITP students get attacked by zombies;
  2. A zombie chase sequence;
  3. A standoff between zombies and non-zombies;
  4. The last remaining non-zombie succumbs.

And four groups were tasked with recording slightly (but not entirely) random 1-minute sound clips:

  1. A videogame is interrupted by a call from the medical center and an apple is eaten;
  2. A family vacation is interrupted by a natural disaster;
  3. A fight breaks out between two unnamed animals in the style of a documentary;
  4. Your chosen candidate loses in the upcoming presidential election.

The audio was recorded onto M-Audios and the filming was done on Xacti cameras set to black and white. Class started at 4pm and we screened the Director’s cut of the movie (composed on the fly in iMovie) at 5pm. Here is the final version, with a little reordering and a judicious edit or two:

PComp Lab 7: High Current Loads and H-Bridges

This blog has too many pictures of circuits. Here’s Dominique Wilkins in Shanghai with some Chinese fans who didn’t know who he was but knew he must be famous. I really miss China.

And now, on to the circuits!

DC motors. Fun. I am so not an engineer, though. I couldn’t remember off the top of my head how the TIP120’s pins are ordered (base, collector, emitter, left to right with the label facing you), even though we used a passel of them in our midterm. That too, I assume, will come with time.

I worked with a 3V motor so I’m not sure the current load was high enough to merit the transistor, but I used it anyway, just in case. I ran the motor (which I attached to the gearbox just for fun) with a pot as per the lab instructions, but got bored quickly, so I replaced the pot with a photo sensor and the DC motor with a 1.3V vibrating DC motor. I wasn’t sure how to convert the 3.3V the Arduino outputs into the 1.3V the motor requires using physical components, so I figured that if I pulsed the motor using analogWrite I could approximate 1.3V without damaging the motor. When the light drops below a certain threshold, the Arduino outputs 100 (out of a possible 255) to the motor. Nothing smelled bad or sounded funny, so I’m assuming I figured it out ok. The code is here.
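
For anyone who doesn’t feel like clicking through, the logic boils down to something like the sketch below; the pin numbers and the light threshold come from my breadboard and will almost certainly be different on yours.

    // Photocell in a voltage divider on analog 0; vibrating motor driven
    // through the TIP120 on pin 9 (one of the PWM pins).
    const int sensorPin = 0;     // analog in: photocell divider
    const int motorPin  = 9;     // PWM out: to the base of the TIP120
    const int threshold = 400;   // "dark enough" cutoff, found by trial and error

    void setup() {
      pinMode(motorPin, OUTPUT);
    }

    void loop() {
      int light = analogRead(sensorPin);   // 0-1023
      if (light < threshold) {
        analogWrite(motorPin, 100);        // ~100/255 duty cycle, gentle on the little motor
      } else {
        analogWrite(motorPin, 0);          // motor off
      }
      delay(10);
    }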

It feels like I’m a step closer to being able to realize my idea of skittish devices that I first explored here.

It took a little doing to get the h-bridge circuit working; I miswired it a couple of times before this:

I kind of zoned out in class while Tom was explaining h-bridges but after putting this circuit together, I understand how they work, which feels pretty good. Two months ago, I was struggling with the idea of a switch. Now, if only I can come up with a final project that works as well as this:
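
And mostly so I don’t have to re-derive it next time: driving the h-bridge from the Arduino looks roughly like the sketch below. Two pins pick the direction (set them to opposite values) and one enable pin sets the speed with PWM. The pin numbers are my own choices, and a real sketch would take its cues from a switch or a pot instead of a timer.

    // Two direction pins into the h-bridge, one enable pin for speed.
    // Flipping which direction pin is HIGH reverses the motor.
    const int dir1Pin   = 3;   // h-bridge input 1
    const int dir2Pin   = 4;   // h-bridge input 2
    const int enablePin = 9;   // h-bridge enable, PWM for speed

    void setup() {
      pinMode(dir1Pin, OUTPUT);
      pinMode(dir2Pin, OUTPUT);
      pinMode(enablePin, OUTPUT);
    }

    void loop() {
      analogWrite(enablePin, 200);   // run at partial speed

      digitalWrite(dir1Pin, HIGH);   // spin one way...
      digitalWrite(dir2Pin, LOW);
      delay(2000);

      digitalWrite(dir1Pin, LOW);    // ...then reverse
      digitalWrite(dir2Pin, HIGH);
      delay(2000);
    }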