Ashitaka:
An Audiovisual Instrument

This is a small site about my PhD.

The aim of my PhD was to investigate the ways in which audio and visuals may be linked in order to create a kind of audiovisual instrument - a musical instrument which would emit not just sound, but light and colour too.

To do this, I settled upon Michel Chion's notion of synchresis, which describes the way in which Foley artists link unrelated sounds and images in film. A classic example is the filmed punch. In reality, punches rarely make much sound, regardless of how much pain is inflicted, yet in film we are accustomed to hearing an exaggerated 'thwack' or 'thump'-type sound when we see someone being punched. The fact that our brains nevertheless create a connection between these unrelated stimuli suggested to me that the phenomenon could be harnessed in the creation of an instrument.

A large part of my thesis therefore revolves around my attempt to determine how synchresis works, and how it could be put to use. I came to the conclusion that synchresis relies on motion: it is the fact that the motion we see (e.g. a fist colliding with someone) is related in some way to the motion we hear (a sound with a sharp attack, synchronised with the point of collision) that convinces our brains to perceive them as a single, fused audiovisual event.
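
To make this motion-based account a little more concrete, here is a minimal C++ sketch. It is purely illustrative - it is not code from Heilan or Ashitaka, and the bouncing-object scenario and every name and number in it are my own assumptions for this page. It simulates a simple visual motion (an object falling and bouncing) and mixes a sharp-attack, exponentially decaying sound into the audio at the exact frame of each collision, so that the motion we would see and the motion we would hear begin at the same instant.

    // Illustrative sketch only - not code from Heilan or Ashitaka.
    // A visual collision and a sharp-attack sound are "welded" together
    // simply by sharing the same instant and the same abrupt change of motion.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    constexpr double kSampleRate = 44100.0; // audio samples per second
    constexpr double kFrameRate  = 60.0;    // visual frames per second
    constexpr double kPi         = 3.14159265358979323846;

    // A percussive sound: instantaneous attack, exponential decay. The sharp
    // attack is the audible analogue of the sudden visual stop at impact.
    std::vector<double> percussiveHit(double freqHz, double decaySeconds, double amplitude)
    {
        const int length = static_cast<int>(decaySeconds * kSampleRate);
        std::vector<double> out(length);
        for (int i = 0; i < length; ++i)
        {
            const double t = i / kSampleRate;
            out[i] = amplitude * std::exp(-6.0 * t / decaySeconds)
                               * std::sin(2.0 * kPi * freqHz * t);
        }
        return out;
    }

    int main()
    {
        // Visual motion: an object dropped from 1 m, bouncing on a floor at y = 0.
        double y = 1.0, velocity = 0.0;
        const double gravity = -9.81, dt = 1.0 / kFrameRate;
        const int numFrames = 180; // three seconds of animation at 60 fps

        // One audio buffer covering the whole animation.
        std::vector<double> audio(static_cast<size_t>(numFrames * kSampleRate / kFrameRate), 0.0);

        for (int frame = 0; frame < numFrames; ++frame)
        {
            velocity += gravity * dt;
            y        += velocity * dt;

            if (y <= 0.0 && velocity < 0.0)
            {
                const double impactSpeed = -velocity;
                if (impactSpeed > 0.25) // ignore the final, inaudibly small bounces
                {
                    // The visual collision: mix a hit into the audio at exactly this
                    // frame, so the seen and the heard motion share one transient.
                    const double strength = std::min(1.0, impactSpeed / 5.0);
                    const auto hit = percussiveHit(180.0, 0.3, strength);
                    const size_t start = static_cast<size_t>(frame * kSampleRate / kFrameRate);
                    for (size_t i = 0; i < hit.size() && start + i < audio.size(); ++i)
                        audio[start + i] += hit[i];

                    std::printf("collision at frame %d (t = %.2f s), impact speed %.2f m/s\n",
                                frame, frame / kFrameRate, impactSpeed);
                }
                velocity = impactSpeed * 0.4; // crude bounce, so we get a few more hits
                y = 0.0;
            }
        }

        // 'audio' now holds percussive transients aligned with each visual
        // collision; a real instrument would stream this to the sound card.
        return 0;
    }

The point is not the physics or the synthesis, both of which are deliberately crude, but the shared transient: the sound starts at the moment the visible motion changes, and its sharp attack mirrors the abruptness of that change.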

To put this theory into practice, I developed a software environment called Heilan and an audiovisual instrument called Ashitaka. The software may be downloaded from here.