Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem

Philosophical Transactions of the Royal Society A, vol. 376, num: 2135, pp: 1-21, 2018

Abstract

The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.

access to information, algorithms, artificial intelligence, frontpage, news, persona, personalisation, right to receive information, user agency

Bibtex

@article{Harambam2018b,
  title = {Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem},
  author = {Harambam, J. and Helberger, N. and van Hoboken, J.},
  url = {http://rsta.royalsocietypublishing.org/content/roypta/376/2133/20180088.full.pdf},
  doi = {10.1098/rsta.2018.0088},
  year = {2018},
  date = {2018-11-23},
  journal = {Philosophical Transactions of the Royal Society A},
  volume = {376},
  number = {2135},
  pages = {1-21},
  abstract = {The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.},
  keywords = {access to information, algorithms, artificial intelligence, frontpage, news, persona, personalisation, right to receive information, user agency},
}