Created in collaboration with Daniel Jones, The Listening Machine observed the interactions of 500 UK-based Twitter users in real time, translating their words, sentiments and social behaviours into a 9-month-long piece of music. Commissioned by the BBC/Arts Council England as part of the on-demand arts channel The Space, it combined natural language processing and algorithmic composition with a vast array of orchestral fragments recorded with the Britten Sinfonia.

Several elements made up the compositional process: sentiment analysis governed the mood of the piece; prosody analysis generated individual melody lines based on the syllables and rhythms of subjects' speech; topic detection triggered preset modular segments and field recordings corresponding to given areas of conversation. The piece was streamed live between May 2012 and January 2013.
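To give a flavour of how such a pipeline might fit together, here is a small illustrative sketch in TypeScript: tweet analysis results (sentiment, speech rhythm, topic) are mapped onto musical parameters. The type names, topic categories and sample-bank labels below are invented for illustration; they are not the project's actual code.

```typescript
// Hypothetical sketch of the kind of mapping described above: tweet analysis
// results driving musical parameters. All names here are illustrative only.

type TweetAnalysis = {
  sentiment: number;          // -1 (negative) .. +1 (positive)
  syllables: number[];        // syllable counts per word, used as a rhythm
  topic: "sport" | "politics" | "weather" | "other";
};

type MusicalEvent = {
  mode: "minor" | "major";    // overall mood derived from sentiment
  rhythm: number[];           // note durations derived from speech rhythm
  sampleBank: string;         // pre-recorded segment chosen by topic
};

// Invented topic-to-sample-bank mapping, standing in for preset segments.
const TOPIC_BANKS: Record<TweetAnalysis["topic"], string> = {
  sport: "brass_fanfares",
  politics: "low_strings",
  weather: "field_recordings",
  other: "ambient_pads",
};

function tweetToMusic(t: TweetAnalysis): MusicalEvent {
  return {
    mode: t.sentiment >= 0 ? "major" : "minor",
    // Map each word's syllable count to a note duration in beats:
    // more syllables -> shorter, busier notes.
    rhythm: t.syllables.map((s) => 1 / Math.max(1, s)),
    sampleBank: TOPIC_BANKS[t.topic],
  };
}

// Example: a cheerful three-word tweet about the weather.
console.log(tweetToMusic({ sentiment: 0.6, syllables: [2, 1, 3], topic: "weather" }));
```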

The visual design was created by Joe Hales, whose textbook-like aesthetic translated perfectly to the screen. Implemented in HTML5 with liquid layouts that scale to different devices, the site shows a live visualisation of the current system state, with a cross-platform interface that is as happy on an iPhone as it is on a desktop.
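As a rough sketch of what such a live view might involve (assuming, purely hypothetically, a JSON endpoint exposing the current state; the URL and fields below are invented), the browser could periodically fetch the state and redraw a canvas that resizes with the viewport:

```typescript
// Minimal sketch of a cross-platform "live state" view: poll a hypothetical
// state endpoint and redraw a canvas sized to the current viewport.

type SystemState = { mood: string; activeTopic: string; tweetsPerMinute: number };

const canvas = document.createElement("canvas");
document.body.appendChild(canvas);

function resize(): void {
  // Liquid layout: fill whatever viewport the device provides.
  canvas.width = window.innerWidth;
  canvas.height = window.innerHeight;
}
window.addEventListener("resize", resize);
resize();

async function refresh(): Promise<void> {
  const res = await fetch("/state.json");       // hypothetical endpoint
  const state: SystemState = await res.json();
  const ctx = canvas.getContext("2d")!;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.font = "16px serif";
  ctx.fillText(`Mood: ${state.mood}`, 20, 40);
  ctx.fillText(`Topic: ${state.activeTopic}`, 20, 70);
  ctx.fillText(`Tweets/min: ${state.tweetsPerMinute}`, 20, 100);
}

setInterval(refresh, 5000);                     // update every five seconds
refresh();
```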


Twitter Stories, Wall Street Journal, Wired, Huffington Post, Scientific American, El País and many more...