16 Seconds of Sunset, 2010

Imagine if all the frames in a 16-second video were printed and stacked on top of each other. Hold this three-dimensional block in your hands and rotate it so that instead of looking at the first frame, you are now looking at just the edges of the frames. That's what you are seeing in this video. Each frame of this video is built from a single column of pixels taken out of every frame of the original sunset video: each original frame contributes one strip, laid down as one row of the new frame. So the top row of pixels in this new video is the leftmost column of pixels from the first frame of the original video. Think of it like a book that you are looking at from the side: you only see the edges of the pages instead of the cover. Now slice that book so that each slice makes a new frame, and each frame contains one strip of pixels from every frame in the original video.
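For readers who think in arrays, the rotation amounts to swapping axes of the stacked video volume. Below is a minimal sketch of that idea, assuming the original clip has already been decoded into a NumPy array of shape (frames, height, width, 3); the resolution, frame rate, and the choice to slice along the width axis are assumptions for illustration, not details taken from the piece.

```python
import numpy as np

def rotate_video_cube(video: np.ndarray) -> np.ndarray:
    """Re-slice the stacked frames along the width axis.

    Input  shape: (T, H, W, 3) -- T frames, each H x W pixels.
    Output shape: (W, T, H, 3) -- W new frames; in each new frame,
    row t is column k of original frame t, so the top row of the first
    new frame is the leftmost column of the original first frame.
    """
    return video.transpose(2, 0, 1, 3)

if __name__ == "__main__":
    # Tiny synthetic stand-in for a 16-second clip at 30 fps.
    t, h, w = 16 * 30, 240, 320
    video = np.random.randint(0, 256, size=(t, h, w, 3), dtype=np.uint8)
    rotated = rotate_video_cube(video)
    print(rotated.shape)  # (320, 480, 240, 3): 320 new frames, 480 x 240 each
```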

The music is an algorithmic composition, meaning that the music wasn't explicitly composed but generated from aspects of the video. In this case the input to the algorithm was the RGB values of a single pixel. Those numbers defined the notes and rhythm, and the shifting values of that pixel's color over time determined what was played. It's a three-part counterpoint, one part each for Red, Green, and Blue.
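The exact mapping from color to music isn't documented, but a sketch can show the general shape of such an algorithm. Everything specific below is an assumption: the C-major scale, the three-octave pitch range, and the rule that a new note begins only when a channel's quantized pitch changes.

```python
import numpy as np

# Hypothetical pixel-to-counterpoint mapping, not the piece's actual algorithm.
SCALE = [0, 2, 4, 5, 7, 9, 11]  # assumed: C-major scale degrees

def channel_to_pitches(values: np.ndarray, base_midi: int = 48) -> list[int]:
    """Map one color channel's 0-255 values to MIDI pitches over 3 octaves."""
    degrees = (values.astype(int) * len(SCALE) * 3) // 256
    return [base_midi + 12 * (d // len(SCALE)) + SCALE[d % len(SCALE)]
            for d in degrees]

def to_voice(pitches: list[int], frame_sec: float) -> list[tuple[int, float]]:
    """Collapse repeated pitches into (pitch, duration) notes: the rhythm
    comes from how long the pixel holds the same quantized pitch."""
    notes: list[tuple[int, float]] = []
    for p in pitches:
        if notes and notes[-1][0] == p:
            notes[-1] = (p, notes[-1][1] + frame_sec)
        else:
            notes.append((p, frame_sec))
    return notes

def counterpoint(pixel_rgb: np.ndarray, fps: float = 30.0):
    """pixel_rgb: shape (frames, 3), one pixel's color across the whole clip.
    Returns three voices, one each for the Red, Green, and Blue channels."""
    frame_sec = 1.0 / fps
    return [to_voice(channel_to_pitches(pixel_rgb[:, c]), frame_sec)
            for c in range(3)]
```

Feeding the three resulting note lists to any synthesizer or MIDI writer would yield three independent melodic lines driven by one pixel, which is the counterpoint structure the paragraph above describes.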

Video Description:

Vibrant oranges, light blues, and deep greys undulate as the music plays. The colors shift, mesh, and roll within each other. As the music crescendos, the black silhouettes of distorted trees punctuate the screen and slowly become the centerpiece of the video as the orange and blue are pushed away. As the piece ends, the scene fades to black and the music cuts off abruptly.