New journal article investigating algorithmic recommendation
New journal article published in the Journal of the Royal Society of New Zealand (Taylor & Francis). My co-authors, Matt Bartlett and Gauri Prabhakar, and I investigated a method for circumventing digital platforms’ unwillingness to share information about their algorithmic recommendations: analysing the evolution of their Privacy Policies and Terms of Use documents.
Analysing Privacy Policies and Terms of Use to understand algorithmic recommendations: the case studies of Tinder and Spotify
The algorithmic recommendations used by digital platforms have significant impacts on users’ behaviours and preferences. For instance, Spotify and Tinder ground their platforms on recommendation algorithms that nudge users to listen to specific songs or romantically match with specific users. Despite their powerful influence, there is little concrete detail about how exactly these algorithms work, as technology companies are increasingly resistant to scholarly scrutiny. This article makes both methodological and substantive contributions to understanding how these influential recommendation algorithms work. We conducted a sequential analysis of historical and contemporary iterations of Spotify’s and Tinder’s Privacy Policies and Terms of Use to ascertain the extent to which such analysis can be used to infer the functionality of their recommendation algorithms. Our results offered certain insights into the companies, such as Spotify acknowledging in its Privacy Policy that companies’ commercial agreements may alter its recommendations. However, the legal documentation of both companies is ambiguous and lacks detail about the platforms’ use of AI and user data. This opaque drafting of Privacy Policies and Terms of Use hampers the capacity of outsiders to properly scrutinise the companies’ algorithms and their relationship with users.