EAREYEMOUTH
This installation was a collaborative work by my colleague Simon Kenny and me. It was exhibited at various conferences as well as Galway Culture Night 2014. We later adapted the installation for use with the Galway Autism Project and were awarded the Google STEM prize at Creative Tech Fest 2014. Below is the about section, which I pulled from the project's website, eareyemouth.com. There's plenty of media and further information on that site.
This installation is a mediator between participants and technology in a social space. It is an experiment in the potential effect of human presence in music composition and immersive sonic experiences. We dressed the objects as human facial features to actively encourage the modes of interaction associated with those body parts, and in turn to invite playful engagement with the piece.
Participants are encouraged to speak into the ear and move their hands above the eye. An audio signal is transmitted from the ear to the eye sculpture to be processed by a hidden computer running the Csound software package. Embedded within the pupil of the eye is a Leap Motion Controller, a computer vision device that tracks both hands, including the fingers, with high accuracy. Csound interprets this tracking information and uses it to manipulate the incoming audio signal, via custom Csound plugin software written by a member of our team.
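The core of this control path is a mapping from raw hand-tracking readings to normalized parameters that the audio processing can consume. The sketch below illustrates the idea in Python; the value ranges (palm height of 100-500 mm above the sensor) and the parameter names are illustrative assumptions, not the installation's actual mapping.

```python
def normalize(value, lo, hi):
    """Clamp and scale a raw tracking value into the 0-1 range."""
    if hi <= lo:
        raise ValueError("invalid range")
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def hand_to_controls(palm_height_mm, pinch_strength):
    """Map hand-tracking readings to control-rate parameters.

    The input ranges and output parameter names here are hypothetical,
    standing in for whatever the custom Csound plugin actually exposes.
    """
    return {
        "pitch_shift": normalize(palm_height_mm, 100.0, 500.0),
        "spectral_blur": min(max(pinch_strength, 0.0), 1.0),
    }

# Example: a palm hovering mid-range over the sensor.
controls = hand_to_controls(palm_height_mm=300.0, pinch_strength=0.25)
```

In a setup like this, the tracking loop would emit such a dictionary on every frame, and the values would be smoothed before being applied, so that hand jitter does not produce audible stepping in the processed signal.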
We use spectral processing tools, such as a phase vocoder, to work with the audio signal in the frequency domain. This approach allows for both intricate and drastic transformations and yields interesting results. The processed signal is sent to the four speakers that make up our quadraphonic sound system, each housed within a sculpture of a mouth.
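The essence of this frequency-domain approach is to analyze each frame of audio into spectral bins, alter the bins, and resynthesize. The toy Python sketch below shows one such transformation, shifting every bin upward; this is a minimal stand-in for the per-frame spectral manipulation a phase vocoder performs (a real implementation would add windowing, overlap-add, and phase correction across frames).

```python
import cmath
import math

def dft(frame):
    """Naive discrete Fourier transform (clarity over speed)."""
    n = len(frame)
    return [sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(spectrum):
    """Inverse DFT, returning the real part of each sample."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def spectral_shift(frame, shift_bins):
    """Move every positive-frequency bin up by `shift_bins`.

    A crude single-frame spectral transform: analyze, relocate energy
    in the frequency domain, then resynthesize.
    """
    spec = dft(frame)
    n = len(spec)
    shifted = [0j] * n
    for k in range(n // 2 + 1):
        j = k + shift_bins
        if 0 <= j <= n // 2:
            shifted[j] = spec[k]
    # Mirror into the negative-frequency bins so the output stays real.
    for j in range(1, n // 2):
        shifted[n - j] = shifted[j].conjugate()
    return idft(shifted)

# A 64-sample sine at bin 4, shifted up by 2 bins, comes out at bin 6.
n = 64
tone = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
out = spectral_shift(tone, 2)
```

Because the transformation happens on spectral bins rather than raw samples, the same machinery supports both subtle effects (blurring, freezing) and drastic ones (large pitch shifts, cross-synthesis), which is what makes the frequency-domain approach attractive for an installation like this.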