Palestine.Frag is an exhibition of augmented photography. The subject of the photos is the town of Nilin, Palestine, its people and their daily life. Through a self-made device, the visitor generates a real-time audio landscape that accompanies him or her through the exhibition. The soundscape is built dynamically from individual audio tracks associated with each photo, processed through a granular synthesis filter. The original sound is deconstructed to create a unique interactive soundtrack.
The photo exhibition was born from the author's experience in Palestine during November 2009, when the author, together with other members of the Italian computer hacklab Bugslab, carried out a grassroots cooperation project with the Popular Resistance Committee of Nilin. Bugslab is a technology activist collective that over the last 10 years has carried out various political activities in the city of Rome, Italy, to promote the conscious use of information technology, free software in general and the sharing of knowledge. In April 2009 it began a process of cooperation with the Popular Resistance Committee of Nilin, a town 30 km south of Ramallah, near the separation wall built by the Israeli government in 2005. Since the first year of the wall's construction, Popular Resistance Committees were born, first in Bilin and subsequently in Nilin and other West Bank towns, to resist the apartheid practices imposed by the Israeli government. Every Friday a demonstration against the occupation is held, shared by local activists, women, men and children, as well as internationals, Israelis, Europeans and Americans, in defense of the Palestinian people's right to self-determination.

To support the Palestinian struggle, Bugslab, by mutual agreement with the Popular Committee, initiated a process of grassroots information technology cooperation. During November 2009, members of the collective helped the community build a medialab in Nilin with 9 computers, to be used for publishing real-time news from the front. At the same time a website, www.nilin-palestine.org, was developed as the organ of expression of the Popular Committee, together with a long training course teaching participants how to use it, how to write news and how to gain wide visibility on the internet.
Unlike other similar projects, our intention was not to build a technical tool for functional operations to be used as "experts", but rather to share self-managed technology practices, with the goal of establishing a technical/journalistic group autonomous from any external source (including Bugslab itself). The grassroots cooperation was entirely self-funded by the collective; no institutional financial aid was requested, in order to preserve as much as possible the autonomous nature of the initiative.
The photographs portray moments of daily life in Nilin, its people and its customs. The exhibition consists of about 15 photographs and is divided into three sections:
The town and the people who inhabit it
The computer lab and computer course
The weekly Friday demonstration, in particular the one that coincided with the twentieth anniversary of the fall of the Berlin Wall.
The sound part is made with the Pure Data software. Through reacTIVision visual markers we assign a specific sound track to each photo. Each time a picture is recognized by the device, its sound track passes through a granular synthesis filter and is played back directly through the visitor's headphones. The original sound was recorded in the same place where the photo was taken and is closely related to the context represented visually. The granular synthesis filter decomposes the original sound and reconstructs it in real time according to the movements of the viewer; in this way we generate a sound landscape that, even if it loses the direct link with the object of the photo, acquires depth and an evocative composition.
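The granular process described above can be sketched in code. The following Python fragment is an illustrative approximation, not the exhibition's actual Pure Data patch: it chops a recording into short windowed "grains" and re-sequences them with random position jitter, the basic mechanism by which the original sound is decomposed and reconstructed. All names and parameter values are hypothetical.

```python
import math
import random

def granulate(samples, grain_size=2205, density=0.5, jitter=4410, seed=None):
    """Decompose `samples` into grains and reassemble them with random
    position jitter, loosely mimicking a granular synthesis filter.

    samples    : list of floats in [-1.0, 1.0]
    grain_size : grain length in samples (~50 ms at 44.1 kHz)
    density    : fraction of grain_size used as the hop between grains
    jitter     : max random displacement of each grain's source position
    """
    rng = random.Random(seed)
    out = [0.0] * len(samples)
    hop = max(1, int(grain_size * density))
    for start in range(0, len(out) - grain_size, hop):
        # pick a source position near `start`, displaced at random
        src = min(max(0, start + rng.randint(-jitter, jitter)),
                  len(samples) - grain_size)
        for i in range(grain_size):
            # Hann window avoids clicks at the grain boundaries
            w = 0.5 - 0.5 * math.cos(2 * math.pi * i / grain_size)
            out[start + i] += w * samples[src + i]
    return out
```

In the installation the jitter and grain parameters would be driven by the viewer's movements rather than fixed, which is what makes each visitor's soundscape unique.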
Some of the original field recordings below:
Arriving to Nilin
Arriving to Nilin2
Going to the Friday Demonstration
The device that allows the visitor to create his or her sound landscape in real time is home-made, assembled from several materials used together. It is a safety helmet fitted with a wireless camera at the front, an FM receiver at the rear and a pair of headphones at the sides. The camera sends its signal to a main computer which, after processing it, generates the sound and sends it back to the helmet via a common FM transmitter connected to the computer. The total cost of the materials used in its construction is about 50 €. During the research that led us to build this device, we considered it essential to adopt low-cost technologies that anyone can reproduce in any context. With the kind of technology we implemented, only one device can be used at a time.
The software that manages the entire process is composed of two parts: a visual marker reader and a granular synthesis engine. The reacTIVision system, integrated with the Pure Data programming environment, is used to recognize the markers: through the wireless camera, the software detects the presence of a marker in the picture and plays back the audio track associated with it. The granular synthesis engine, also made with Pure Data, generates the synthesis from the original track and sends it to the sound2vision helmet through an FM transmitter, changing the audio parameters in real time depending on the picture being viewed.
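The control flow of the two-part system can be sketched as follows. This is a hypothetical Python illustration of the dispatch logic, not the actual Pd/reacTIVision patch: a recognized marker ID selects a field recording and a set of granular synthesis parameters. Marker IDs, filenames and parameter values are all invented for the example.

```python
# marker id -> (audio file, granular synthesis parameters); illustrative only
TRACKS = {
    0: ("arriving_to_nilin.wav", {"grain_ms": 50, "density": 0.5}),
    1: ("friday_demo.wav",       {"grain_ms": 30, "density": 0.8}),
}

def on_marker(marker_id, player):
    """Called whenever the camera software reports a visible marker.

    `player` stands in for the synthesis engine; it only needs a
    play(filename, **params) method. Returns True if a track was started.
    """
    entry = TRACKS.get(marker_id)
    if entry is None:
        return False  # unknown marker: leave the current soundscape running
    filename, params = entry
    player.play(filename, **params)  # hand off to the granular synthesis stage
    return True
```

In the real installation this mapping lives inside the Pure Data patch and the parameters are additionally modulated in real time by the viewer's movements.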