Project Paperclip is a photographic exhibit that uses augmented reality. The experience is unique each time it is activated: the app’s algorithm processes, in real time, variables from the visitor’s surroundings, such as the time of day, the noise level in the room, the visitor’s voice, and the visitor’s movement and location, among others.
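
The exact mapping from these inputs to sound is not documented here, but a minimal sketch of how such real-time variables might be gathered and turned into soundscape parameters on iOS could look like the following (all type and property names are illustrative assumptions, not the app’s actual code):

```swift
import Foundation
import CoreMotion

// Illustrative parameters a reactive soundscape might expose.
// These names are assumptions for this sketch, not the app's real model.
struct SoundscapeParameters {
    var ambienceVolume: Float   // driven by the room's noise level
    var motionIntensity: Float  // driven by how much the visitor moves
    var dayNightBlend: Float    // driven by the local time of day
}

final class EnvironmentSampler {
    private let motionManager = CMMotionManager()
    private var lastMovement: Double = 0

    func start() {
        // Sample device motion at roughly 10 Hz, if the hardware supports it.
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 0.1
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let a = data?.userAcceleration else { return }
            // Overall movement magnitude, independent of direction.
            self?.lastMovement = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        }
    }

    // Map raw readings to soundscape parameters. `noiseLevel` (0...1) would
    // come from microphone metering, which is omitted here for brevity.
    func currentParameters(noiseLevel: Double) -> SoundscapeParameters {
        let hour = Calendar.current.component(.hour, from: Date())
        let dayNight = Float(abs(hour - 12)) / 12.0   // 0 at noon, 1 at midnight
        return SoundscapeParameters(
            ambienceVolume: Float(min(1.0, max(0.0, noiseLevel))),
            motionIntensity: Float(min(1.0, lastMovement)),
            dayNightBlend: dayNight
        )
    }
}
```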

Augmented reality, as it is usually understood, uses a digital interface to bridge our world and the digital one, creating a mixed environment in real time in which the distinction between the two realities is reduced. In this exhibit, this is achieved with an iPhone, headphones, and software available for free on the Apple App Store.

To make full use of this exhibit, visitors with an iPhone 3 or above need to download the Project Paperclip application from the Apple App Store. With headphones on (the better their quality, the more immersive the simulation), launch the app and follow the instructions to activate the reactive soundscapes. The process is simple: point the iPhone’s camera at the photograph’s QR code, and once it has been scanned you will unlock the soundscape created for that photograph.
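
What the scanning step involves under the hood is not specified in the exhibit text, but a minimal sketch of a QR-triggered unlock on iOS, using AVFoundation’s built-in metadata detection, might look like this (the `unlockSoundscape` hand-off is a hypothetical placeholder, not the app’s real API):

```swift
import AVFoundation
import UIKit

// Sketch of the QR-scanning step: detect a code in the camera feed and
// hand its payload off to whatever loads the matching soundscape.
final class QRScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]

        // Show the camera feed so the visitor can aim at the photograph.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let payload = code.stringValue else { return }
        session.stopRunning()
        // Hypothetical hand-off: unlock the soundscape tied to this photograph.
        unlockSoundscape(identifiedBy: payload)
    }

    private func unlockSoundscape(identifiedBy id: String) {
        print("Unlocked soundscape for photograph: \(id)")
    }
}
```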

Concept & photography by Nuno Serrão
Development and augmented sound by Yuli Levtov and Ragnar Hrafnkelsson (Reactify Music)
Soundscapes by Alexandre Gonçalves and Nuno Serrão