TwitterRadio (2014)

Every second, thousands of multimedia items are generated to share life experiences and feelings with friends and to voice personal opinions on trending topics. These data are traditionally explored visually, through pictures, videos, and charts. In recent years, sonification has shown potential for uncovering new insights in data representation. TwitterRadio explores music as a medium for representing information: the idea behind the project is to use the musical domain to convey the latest trends and news. The system automatically generates tonal music compositions intended to match the emotional content of the tweets as well as their frequency. By operating a tuning knob, users can tune TwitterRadio to pre-set hashtags or add new themes of interest by typing a hashtag. TwitterRadio collects all recent tweets associated with the hashtag and extracts their emotional valence (positive vs. negative), their frequency, and their re-tweet frequency. These features are then mapped to music that matches the detected mood and intensity: a hashtag with only a few mentions results in a slow, flat melody, while a trending hashtag produces intense sounds.
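The sketch below illustrates one way such a feature-to-music mapping could look. It is not the project's actual code; the names (TweetStats, map_to_music), the valence range of -1 to +1, and the specific scaling constants are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class TweetStats:
    valence: float          # sentiment score, -1 (negative) .. +1 (positive)
    tweets_per_min: float   # how often the hashtag is mentioned
    retweets_per_min: float # how often those tweets are re-tweeted

def map_to_music(stats: TweetStats) -> dict:
    """Map tweet statistics to coarse musical parameters (hypothetical mapping)."""
    # Positive valence -> major mode, negative -> minor mode.
    mode = "major" if stats.valence >= 0 else "minor"
    # More activity -> faster tempo, clamped to a playable range.
    activity = stats.tweets_per_min + 0.5 * stats.retweets_per_min
    tempo_bpm = min(60 + 4 * activity, 180)
    # Re-tweets boost dynamics, scaled to a MIDI-like velocity of 40..120.
    velocity = int(min(40 + 2 * stats.retweets_per_min, 120))
    return {"mode": mode, "tempo_bpm": tempo_bpm, "velocity": velocity}

if __name__ == "__main__":
    quiet_tag = TweetStats(valence=-0.2, tweets_per_min=1, retweets_per_min=0)
    trending_tag = TweetStats(valence=0.8, tweets_per_min=30, retweets_per_min=15)
    print(map_to_music(quiet_tag))     # slow, soft, minor melody
    print(map_to_music(trending_tag))  # fast, loud, major melody
```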

Year: 2013-14

Role: ideation, conceptual design, UX design, algorithmic composition, research

Collaborators: Aliaksei Miniukovich, Andrea Conci, Antonella De Angeli

Exhibitions:

  • 22 August 2014, MART (Museum of Contemporary Art), Trento (Italy).
  • 26 April 2014, CHI 2014 Interactivity, Toronto (Canada).
  • 19 September 2013, CHItaly 2013 Interactive Experiences, Trento (Italy).