Every second, thousands of multimedia elements are generated to share life experiences and feelings with friends and to voice personal opinions on trending topics. These data are traditionally explored visually, through pictures, videos and charts. In recent years, sonification has shown potential for uncovering new insights in data representation.

TwitterRadio explores the role of music as a medium for representing information. The idea behind the project is to use the musical domain to convey the latest trends and news. The system automatically generates tonal music compositions intended to match the emotional content of the tweets as well as their frequency. By operating a tuning knob, users can tune the TwitterRadio to pre-set hashtags, or add new themes they are interested in by typing a hashtag. The TwitterRadio collects all recent associated tweets and retrieves their emotional valence (positive vs. negative), their frequency and their re-tweet frequency. These features are then mapped into music that matches the detected mood and intensity: a hashtag with only a few mentions results in a slow, flat melody, while a trending hashtag produces intense sounds.
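A minimal sketch of this kind of feature-to-music mapping, for illustration only: the function name, thresholds and parameter ranges below are assumptions, not TwitterRadio's actual implementation. It maps valence to mode (major/minor) and mention frequency to tempo, mirroring the slow-melody vs. intense-sound behaviour described above.

```python
def map_tweets_to_music(valence: float, tweets_per_min: float) -> dict:
    """Hypothetical mapping from tweet features to tonal-music parameters.

    valence:        emotional valence in [-1.0, 1.0] (negative .. positive)
    tweets_per_min: mention frequency, re-tweets included
    """
    # Positive valence -> major mode, negative -> minor mode.
    mode = "major" if valence >= 0 else "minor"

    # Frequency drives tempo: a few mentions yield a slow melody,
    # a trending hashtag an intense one (clamped to a musical BPM range).
    tempo_bpm = max(60, min(180, 60 + 2 * tweets_per_min))

    # Stronger valence in either direction widens the dynamics.
    dynamics = "forte" if abs(valence) > 0.5 else "piano"

    return {"mode": mode, "tempo_bpm": tempo_bpm, "dynamics": dynamics}
```

A quiet hashtag (e.g. 2 mentions per minute) stays near the slow end of the tempo range, while a trending one saturates at the upper bound.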
Role: ideation, conceptual design, UX design, algorithmic composition, research
Collaborators: Aliaksei Miniukovich, Andrea Conci, Antonella De Angeli