|Commission:||Commissioned by Folly and the Abandon Normal Devices Festival|
|Exhibition:||Abandon Normal Devices Festival of New Cinema and Digital Culture|
Cumbria, UK: 15 March – 10 April 2010
Manchester, UK: 1 October – 7 October 2010
|Materials:||Facebook application, computers|
|Description:||Today, too, I experienced something I hope to understand in a few days comprises three elements, all of which engage with different forms of (self-)surveillance. The first element is a series of video portraits of volunteers, shot using poses and actions loosely based on Danish filmmaker Jørgen Leth’s 1967 film The Perfect Human. The work’s title comes from a line in the film. The videos are uploaded to a database, where a program automatically edits them in the style of Leth’s film, their actions becoming jerky and strangely mechanical.
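The automatic editing step described above could be approximated by scattering short, non-contiguous cuts across a portrait and concatenating them, producing the jump-cut feel of Leth’s film. A minimal sketch, not the project’s actual code; the function name, segment count, and clip length are all hypothetical:

```python
# Hypothetical sketch of the automatic editing step: choose short,
# non-contiguous segments across the source video so that playing them
# back-to-back produces jerky, mechanical jump cuts.
import random

def jump_cut_points(duration, n_cuts=8, clip_len=1.5, seed=None):
    """Return (start, end) times for short clips scattered across the video."""
    rng = random.Random(seed)
    starts = sorted(rng.uniform(0, duration - clip_len) for _ in range(n_cuts))
    return [(s, s + clip_len) for s in starts]
```

The cut list would then be handed to a video tool (the Technical credits mention ffmpeg) to extract and concatenate the segments.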
The second element of the work uses text from status updates acquired from Facebook users who voluntarily signed up on Facebook to participate in the project. A software application automatically finds narrative relationships between the status updates and matches each Facebook profile to a video portrait with a similar demographic; the profile texts then serve as subtitles to the portraits. The final component of the project involves software that searches YouTube for videos linked to keywords in the subtitles. The finished works are shown on YouTube as split-screen videos, with the video portrait on one side and its paired YouTube video on the other.
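The demographic matching and keyword steps described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the project’s code (the actual narrative algorithm was written in Java by Reid Swanson); the function names, field names, and stopword list are hypothetical:

```python
# Hypothetical sketch of two steps in the pipeline: pair a Facebook
# profile with the video portrait whose demographic (gender, then age)
# is closest, and pull search keywords from a status update.

def match_profile_to_portrait(profile, portraits):
    """Return the portrait whose age/gender best matches the profile."""
    same_gender = [p for p in portraits if p["gender"] == profile["gender"]]
    pool = same_gender or portraits
    return min(pool, key=lambda p: abs(p["age"] - profile["age"]))

STOPWORDS = {"the", "a", "an", "is", "to", "and", "of", "in", "i", "my"}

def keywords(status_update):
    """Naive keyword extraction: drop stopwords, keep the rest."""
    words = status_update.lower().split()
    return [w.strip(".,!?") for w in words if w not in STOPWORDS]
```

The resulting keywords would serve as YouTube search terms for finding each portrait’s paired video.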
This is the first art project to be made ‘inside’ Facebook, and it takes advantage of the controversial privacy settings available to Facebook developers. It also uses story-detection software designed to discover narrative in large volumes of blog entries (developed at USC by Reid Swanson). YouTube, as a large, self-tagged database of human activities, is an ideal source of contextual data to accompany the Facebook narratives. The project runs as a Facebook application: everyone who joins the application participates in the artwork. Data from their profile (age, gender and status updates) is analyzed for narrative content, and potentially used to generate a short film. In exchange, each video that is generated is posted to the participant’s Facebook page, locating their own individual ‘story’ within a much larger set of human concerns.
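The split-screen composition could be built with ffmpeg, which the Technical credits list. A minimal sketch assuming ffmpeg’s `hstack` filter places the two videos side by side; the file names and exact filter graph are assumptions, not the project’s actual command:

```python
# Hypothetical sketch of the split-screen step: compose a video portrait
# and its paired YouTube clip side by side using ffmpeg's hstack filter.
import subprocess

def split_screen_cmd(portrait, youtube_clip, out):
    """Build an ffmpeg command that places the two videos side by side."""
    return [
        "ffmpeg", "-y",
        "-i", portrait,
        "-i", youtube_clip,
        "-filter_complex", "[0:v][1:v]hstack=inputs=2[v]",
        "-map", "[v]",
        out,
    ]

# Usage (requires ffmpeg on PATH):
# subprocess.run(split_screen_cmd("portrait.mp4", "clip.mp4", "pair.mp4"), check=True)
```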
YouTube Channel: http://www.youtube.com/ihopetounderstand
Project Overview: http://www.youtube.com/watch?v=3QrnGkbZZk0
Sample video: http://www.youtube.com/watch?v=9VH1_qadJ-M
|Credits:||With thanks to DXARTS, James Hughes, Reid Swanson, Noel Paul and Martin Jarmick.|
|Technical:||Facebook application built using Python, ffmpeg, MySQL and the Facebook API. Narrative algorithm built in Java by Reid Swanson.|
|Reviews/News:||Art Monthly Review|
Searching for the Real in Automated Self-Presentation: James Coupe’s Today, too, I experienced something I hope to understand in a few days (abstract)
Imperica: Being Watched
Screening at Henry Art Gallery
Entry in Art and Electronic Media