Archive for the 'video' Category

Jan
28

Raw footage of OPD shooting tear gas, bean bag rounds (green flashes), and flash bang grenades at Occupy Oakland protesters on the January 28th move-in day:

And some photos of the debris left over after the street cleared out:

The green thing is a bean bag.

Oct
24

untitled
video
Presented at the Santa Cruz Museum of Art and History

The animation and soundtrack are algorithmically generated, mostly composed from trigonometric functions. Both are created from a single program and run in real time at 60 fps. No random functions are used.

Created using OpenGL in Processing and PureData.
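The deterministic, trig-only approach can be sketched as below. This is not the original program (which is a Processing/PureData piece); it's a minimal plain-Java illustration of generating animation frames purely from sine and cosine of position and time, with no random functions.

```java
// Minimal sketch (hypothetical, not the original program): a deterministic
// pattern generated purely from trigonometric functions of position and time.
public class TrigPattern {
    // Brightness in [0, 1] for pixel (x, y) at time t -- no random functions.
    static double brightness(int x, int y, double t) {
        double v = Math.sin(x * 0.05 + t) * Math.cos(y * 0.05 - t * 0.7);
        return (v + 1.0) / 2.0; // map [-1, 1] to [0, 1]
    }

    public static void main(String[] args) {
        // Render one row of an 8-frame "animation" as ASCII shades.
        String shades = " .:-=+*#%@";
        for (int frame = 0; frame < 8; frame++) {
            double t = frame / 60.0; // 60 fps timeline
            StringBuilder row = new StringBuilder();
            for (int x = 0; x < 40; x++) {
                int i = (int) (brightness(x, 0, t) * (shades.length() - 1));
                row.append(shades.charAt(i));
            }
            System.out.println(row);
        }
    }
}
```

Because everything derives from the frame time, rerunning the program always produces the identical animation.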

May
26

The UC State of the Arts just posted a video of our still building installation from the University of California Institute for Research in the Arts’ “Future Tense” conference last fall:

Feb
23

video installation
collaboration with Ted Passon
Instructions: Come inside. Kneel. Light a candle. Say a prayer. Place your lit candle in the prayer cupboard.
Matinee, St. Cecilia’s Gallery, Brooklyn, NY, February 5-6, 2011

Reviewed on artnet.com

More video and photo documentation coming soon.

Feb
21

Jan
17

video

Sep
19

photos by Kally Kahn

Here are some photos of people playing my video game at the art.tech exhibition at the Lab in San Francisco. And below is the Processing code for the game. It’s currently configured to run on a computer with an internal video camera and an external video camera attached via FireWire. I have tested it using an iSight camera and a DV camera, and both work great. I recommend running it with two mirrored screens set at 640×480. You will also need to add two sounds named “good.wav” and “bad.wav” to the Processing data folder; classic video game sounds work well. See this post for instructions on playing. Have fun!

Processing Code

photos by Kally Kahn

Jul
22

This is a video of two people playing a video game I made using Processing last May. Each player sees a silhouette of themselves on screen whenever they move (their silhouette disappears when they are still). The player also sees a silhouette of the other player. The object of the game is to collect the blue balls while avoiding the red balls to achieve a high score. Balls can only be collected when both players’ silhouettes overlap each other and a ball, so the two players must work together. Each blue ball collected is worth one point, while each red ball subtracts one point.
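The cooperative scoring rule boils down to a few lines. This is a hypothetical re-implementation in plain Java, not the original Processing sketch: a ball counts only when both players' silhouettes cover it, blue scoring +1 and red scoring -1.

```java
// Hypothetical re-implementation of the scoring rule described above: a ball
// counts only when both players' silhouettes cover it; blue is +1, red is -1.
public class CoopScore {
    static int scoreDelta(boolean aOverBall, boolean bOverBall, boolean isBlue) {
        if (aOverBall && bOverBall) {   // both players must cover the ball
            return isBlue ? 1 : -1;
        }
        return 0;                       // no joint overlap: nothing happens
    }

    public static void main(String[] args) {
        System.out.println(scoreDelta(true, true, true));   // both on blue: 1
        System.out.println(scoreDelta(true, true, false));  // both on red: -1
        System.out.println(scoreDelta(true, false, true));  // only one player: 0
    }
}
```

Requiring both masks to cover the ball is what forces the two players to move together.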

May
8

Grey/Green
The first 35 minutes of Grey Gardens arranged by green.

Grey/Blue
The first 35 minutes of Grey Gardens arranged by blue.
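The posts don't spell out the technique, but "arranged by green" suggests ordering the film's frames by how much green they contain. Here's a hedged guess at that idea in plain Java, with each frame reduced to a single packed RGB value standing in for its average color.

```java
import java.util.Arrays;
import java.util.Comparator;

// A hedged guess at the "arranged by green" idea (the post doesn't spell it
// out): order frames by their green-channel value, least to most green.
public class ArrangeByChannel {
    // Extract the green channel (0-255) from a packed 0xRRGGBB value.
    static int green(int rgb) {
        return (rgb >> 8) & 0xFF;
    }

    // Return the frames sorted from least to most green.
    static Integer[] arrangedByGreen(Integer[] frames) {
        Integer[] sorted = frames.clone();
        Arrays.sort(sorted, Comparator.comparingInt(ArrangeByChannel::green));
        return sorted;
    }

    public static void main(String[] args) {
        Integer[] frames = {0x10FF30, 0x102030, 0x108030}; // stand-in frame colors
        for (int rgb : arrangedByGreen(frames)) {
            System.out.printf("%06X%n", rgb);
        }
    }
}
```

Swapping `green` for a blue-channel extractor (`rgb & 0xFF`) would give the Grey/Blue arrangement.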

May
8

Jan
28

daotw

I projected some live reactive video filters for the Dragging an Ox through Water show last week. Above is a photo of the “Bubble Filter” in action. 

visuals

Above are stills from four of the filter modes. Modes can be selected by pressing the 1-5 keys or the “b” and “w” keys. The modes are as follows:

1: Unfiltered video
2: Black and White Bubble Filter
3: Colored Bubble Filter
4: Grey/Red Frame Difference Filter (based on the “Frame Differencing” sketch by Golan Levin)
5: Grey/Red Frame Difference Filter w/noise (based on the “Frame Differencing” sketch by Golan Levin)
B: Black screen
W: White screen
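The core of modes 4 and 5 is frame differencing: each pixel of the current frame is compared with the previous frame, so only motion shows up. This is a minimal plain-Java sketch of that idea, not the original Processing code (which is based on Golan Levin's "Frame Differencing" example).

```java
// Sketch of the frame-differencing idea behind modes 4 and 5: each pixel of
// the current frame is compared with the previous frame, so only changed
// (moving) pixels produce a nonzero value.
public class FrameDiff {
    // Per-pixel absolute difference between two grayscale frames (0-255).
    static int[] difference(int[] current, int[] previous) {
        int[] out = new int[current.length];
        for (int i = 0; i < current.length; i++) {
            out[i] = Math.abs(current[i] - previous[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] previous = {10, 100, 90};
        int[] current  = {10, 200, 30};
        int[] diff = difference(current, previous);
        for (int d : diff) {
            System.out.println(d); // prints 0, 100, 60
        }
    }
}
```

In the grey/red filter, the difference value would drive how red each pixel is drawn; the noise variant of mode 5 would perturb that value before drawing.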

If you have a built-in video camera on your computer, you can load the sketch in Processing and start playing with live filtered video.

Processing Code

Jan
10

daotw

I’m putting together this show and it’s going to be amazing! Brian of Dragging an Ox is one of my favorite musicians and he puts on a great show. I’ll be contributing a reactive video piece to the show as part of the DANM exhibition.


Dec
4

In this program, each bubble has a note (in C major) associated with it. When a silhouette in the video feed overlaps with the bubble, the note is sent to an audio program (Reason, in this case) in the form of a MIDI note. The bubbles can be moved left and right using an Arduino equipped with an accelerometer.
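One simple way to assign a C major note to each bubble is to map the bubble's index onto successive scale steps. This is a hypothetical illustration in plain Java; the note assignment in the actual sketch may differ.

```java
// Hypothetical mapping from a bubble's index to a MIDI note in C major; the
// actual note assignment in the original sketch may differ.
public class BubbleNotes {
    // C major scale degrees as semitone offsets from the root.
    static final int[] C_MAJOR = {0, 2, 4, 5, 7, 9, 11};

    // Root at middle C (MIDI 60); successive bubbles climb the scale.
    static int midiNote(int bubbleIndex) {
        int octave = bubbleIndex / C_MAJOR.length;
        int degree = bubbleIndex % C_MAJOR.length;
        return 60 + octave * 12 + C_MAJOR[degree];
    }

    public static void main(String[] args) {
        for (int i = 0; i < 8; i++) {
            System.out.println("bubble " + i + " -> MIDI note " + midiNote(i));
        }
    }
}
```

When a silhouette overlaps a bubble, the sketch would send `midiNote(i)` out as a note-on message for the audio program to voice.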

Processing Code

Similar to the above, but with two video feeds incorporated. Bubbles can now only be played if they overlap with red sections of the video. Red sections are created when two silhouettes from separate video feeds overlap, so cooperation between the people in the two feeds is necessary to create notes.

Processing Code

Both these programs build on the programs from my previous two posts. 

Dec
3

In this project, people in two separate rooms are videotaped. The two video feeds are fed through Processing, converted into silhouettes, and superimposed over each other. Both rooms receive the same projection of the superimposed silhouettes. When silhouettes from the two rooms overlap, the overlapped area turns red and a MIDI note is sent to an external audio program (Reason, in this case).

The video feeds are initialized using a background subtraction technique, so only new objects in the space (i.e., people) are fed back as silhouettes. The program sends MIDI notes using the RWMidi library. The program contains four different modes, some in which participants hear the same sounds, some in which they hear different sounds. All notes are currently in the C major scale.
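The two-room logic can be sketched as follows: background subtraction turns each feed into a silhouette mask, and pixels covered by both masks form the overlap that turns red and triggers a note. This is a plain-Java illustration, not the Processing/RWMidi original; the threshold and frame sizes are illustrative.

```java
// Sketch of the two-room logic: simple background subtraction turns each feed
// into a silhouette mask, and pixels covered by both masks are the overlap
// that is drawn red and triggers a MIDI note in the installation.
public class OverlapRooms {
    // A pixel is part of the silhouette if it differs enough from the
    // background frame captured at initialization.
    static boolean[] silhouette(int[] frame, int[] background, int threshold) {
        boolean[] mask = new boolean[frame.length];
        for (int i = 0; i < frame.length; i++) {
            mask[i] = Math.abs(frame[i] - background[i]) > threshold;
        }
        return mask;
    }

    // Pixels where silhouettes from both rooms are present.
    static boolean[] overlap(boolean[] a, boolean[] b) {
        boolean[] out = new boolean[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = a[i] && b[i];
        }
        return out;
    }

    public static void main(String[] args) {
        int[] background = {50, 50, 50, 50};
        boolean[] roomA = silhouette(new int[]{50, 200, 200, 50}, background, 30);
        boolean[] roomB = silhouette(new int[]{50, 50, 200, 200}, background, 30);
        boolean[] red = overlap(roomA, roomB);
        for (boolean r : red) {
            System.out.println(r); // only the pixel covered by both rooms is true
        }
    }
}
```

In the installation, any `true` pixel in the overlap mask would be painted red, and entering a newly overlapped region would fire a note in the current mode's scale.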

My version sounds like this (all sounds can be modulated in the sound program you choose):

[Audio clip]

Processing Code

Nov
1

For the 24h Thesis I also created live video filters with Processing. These are composite frames of filtered video. 

Processing Code: [1] [2]