Archive for the 'video' Category

Jan 28

[Photo: the “Bubble Filter” projected at the Dragging an Ox through Water show]

I projected some live reactive video filters for the Dragging an Ox through Water show last week. Above is a photo of the “Bubble Filter” in action. 

[Stills from four of the filter modes]

Above are stills from four of the filter modes. Modes can be selected by pressing the 1-5 keys or the “b” and “w” keys. The modes are as follows:

1: Unfiltered video
2: Black and White Bubble Filter
3: Colored Bubble Filter
4: Grey/Red Frame Difference Filter (based on the “Frame Differencing” sketch by Golan Levin)
5: Grey/Red Frame Difference Filter w/noise (based on the “Frame Differencing” sketch by Golan Levin)
B: Black screen
W: White screen

If you have a built-in video camera on your computer, you can load up the sketch in Processing and start playing with live filtered video.

Processing Code
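For a sense of how the key-based mode switching could be wired up, here is a minimal sketch, not the show code itself. It assumes the Processing video library and a default capture device, handles only the unfiltered, black/white-screen, and a stand-in frame-difference mode (the bubble filters are omitted), and all names and thresholds are illustrative.

// Minimal sketch: key-selected filter modes over a live camera feed.
// Assumes the Processing video library and a default 640x480 capture device.
import processing.video.*;

Capture cam;
int mode = 1;           // current mode: 1-5, plus 0 for black screen, 6 for white
int[] prevFrame;        // previous frame, used by the difference modes

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  prevFrame = new int[width * height];
}

void draw() {
  if (cam.available()) cam.read();

  if (mode == 0) { background(0);   return; }   // 'b': black screen
  if (mode == 6) { background(255); return; }   // 'w': white screen
  if (mode == 1) { image(cam, 0, 0); return; }  // unfiltered video

  // Modes 2-5 all fall back to a simple grey/red frame difference here:
  // grey where the image is still, shifted toward red where it changed.
  cam.loadPixels();
  loadPixels();
  for (int i = 0; i < cam.pixels.length; i++) {
    int curr = cam.pixels[i];
    float b = brightness(curr);
    float diff = abs(b - brightness(prevFrame[i]));
    pixels[i] = color(b, b - diff, b - diff);
    prevFrame[i] = curr;
  }
  updatePixels();
}

void keyPressed() {
  if (key >= '1' && key <= '5') mode = key - '0';
  else if (key == 'b') mode = 0;
  else if (key == 'w') mode = 6;
}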

Jan 10

[Image: daotw]

I’m putting together this show, and it’s going to be amazing! Brian of Dragging an Ox is one of my favorite musicians, and he puts on a great show. I’ll be contributing a reactive video piece as part of the DANM exhibition.


Dec 4

In this program, each bubble has a note (in the C major scale) associated with it. When a silhouette in the video feed overlaps with a bubble, its note is sent to an audio program (Reason, in this case) as a MIDI note. The bubbles can be moved left and right using an Arduino equipped with an accelerometer.

Processing Code
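The core idea can be sketched roughly as below: when enough silhouette pixels fall inside a bubble, its note is sent out over MIDI. This assumes the RWMidi library and a white-silhouette feed; the bubble layout, pitches, threshold, and brightness cutoff are placeholders, and the Arduino/accelerometer input is omitted.

// Sketch of the bubble/note idea: a bubble sounds while enough bright
// ("silhouette") pixels overlap it. Positions, pitches, and thresholds
// are illustrative, not the posted code.
import processing.video.*;
import rwmidi.*;

Capture cam;
MidiOutput midiOut;
int[] pitches = {60, 62, 64, 65, 67, 69, 71, 72};  // C major scale
float[] bubbleX = new float[8];
float bubbleY = 240, bubbleR = 30;
boolean[] sounding = new boolean[8];

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  midiOut = RWMidi.getOutputDevices()[0].createOutput();
  for (int i = 0; i < 8; i++) bubbleX[i] = 40 + i * 80;
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  cam.loadPixels();

  for (int i = 0; i < 8; i++) {
    // Sample pixels inside this bubble and count bright ones.
    int hits = 0;
    for (int dx = -int(bubbleR); dx <= bubbleR; dx += 4) {
      for (int dy = -int(bubbleR); dy <= bubbleR; dy += 4) {
        int x = int(bubbleX[i]) + dx, y = int(bubbleY) + dy;
        if (x < 0 || x >= width || y < 0 || y >= height) continue;
        if (brightness(cam.pixels[y * width + x]) > 200) hits++;
      }
    }
    boolean overlapped = hits > 20;
    if (overlapped && !sounding[i]) midiOut.sendNoteOn(0, pitches[i], 100);
    if (!overlapped && sounding[i]) midiOut.sendNoteOff(0, pitches[i], 0);
    sounding[i] = overlapped;
    noFill();
    ellipse(bubbleX[i], bubbleY, bubbleR * 2, bubbleR * 2);
  }
}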

Similar to the above, but with two video feeds incorporated. Bubbles can now only be played if they overlap with red sections of the video. Red sections appear where silhouettes from the two separate feeds overlap, so cooperation between the people in the two feeds is necessary to create notes.

Processing Code
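The red-overlap rule itself is simple to sketch: a pixel turns red only where both feeds see a silhouette. The sketch below assumes two capture devices are available to one machine and that silhouettes arrive as bright pixels; device indices, the brightness cutoff, and the white/black convention are assumptions.

// Two-camera variant: red appears only where silhouettes from both
// feeds overlap, and only red areas can play a bubble's note.
import processing.video.*;

Capture camA, camB;

void setup() {
  size(640, 480);
  String[] devices = Capture.list();
  camA = new Capture(this, width, height, devices[0]);
  camB = new Capture(this, width, height, devices[1]);
  camA.start();
  camB.start();
}

void draw() {
  if (camA.available()) camA.read();
  if (camB.available()) camB.read();
  camA.loadPixels();
  camB.loadPixels();
  loadPixels();
  for (int i = 0; i < pixels.length; i++) {
    boolean inA = brightness(camA.pixels[i]) > 200;   // silhouette in feed A
    boolean inB = brightness(camB.pixels[i]) > 200;   // silhouette in feed B
    if (inA && inB)      pixels[i] = color(255, 0, 0); // overlap: red, playable
    else if (inA || inB) pixels[i] = color(255);       // single silhouette: white
    else                 pixels[i] = color(0);         // background: black
  }
  updatePixels();
}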

Both these programs build on the programs from my previous two posts. 

Dec 3

In this project, people in two separate rooms are videotaped. The two video feeds are fed through Processing, converted into silhouettes, and superimposed on each other. Both rooms receive the same projection of the superimposed silhouettes. When silhouettes from the two rooms overlap, the overlapping area turns red and a MIDI note is sent to an external audio program (Reason, in this case).

The video feeds are initialized using a background subtraction technique, so only new objects in the space (i.e., people) appear as silhouettes. MIDI notes are sent using the RWMidi library. The program has four different modes: in some, participants hear the same sounds; in others, they hear different sounds. All notes are currently in the C major scale.
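As a rough illustration of the background subtraction step, the sketch below stores a reference frame of the empty room and marks any pixel that later differs strongly from it as silhouette. The spacebar capture, the brightness-only comparison, and the threshold value are assumptions, not the actual project code.

// Background subtraction: store a frame of the empty room, then mark
// any pixel that differs strongly from it as part of a silhouette.
import processing.video.*;

Capture cam;
int[] backgroundFrame;
float threshold = 40;   // how different a pixel must be to count as "new"

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  backgroundFrame = new int[width * height];
}

void draw() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  loadPixels();
  for (int i = 0; i < cam.pixels.length; i++) {
    float diff = abs(brightness(cam.pixels[i]) - brightness(backgroundFrame[i]));
    pixels[i] = diff > threshold ? color(255) : color(0);  // silhouette vs. background
  }
  updatePixels();
}

void keyPressed() {
  if (key == ' ') arrayCopy(cam.pixels, backgroundFrame);  // capture empty-room frame
}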

My version sounds like this (all sounds can be modulated in the sound program you choose):

[Audio clip]

Processing Code

Nov 1

For the 24h Thesis, I also created live video filters with Processing. These are composite frames of the filtered video.

Processing Code: [1] [2]
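One simple way to build a composite still like these is to blend each new filtered frame into an accumulating image at low opacity, so motion piles up over time. This is only a guess at the approach, not the posted code, and the tint level and camera setup are assumptions.

// Possible composite technique: draw each new camera frame over the
// previous ones at low opacity so successive frames accumulate.
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  background(0);
}

void draw() {
  if (cam.available()) cam.read();
  tint(255, 25);      // draw each frame at roughly 10% opacity
  image(cam, 0, 0);   // frames blend into a composite over time
}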