Personalised pricing: The demise of the fixed price?

Poort, J. & Zuiderveen Borgesius, F.
Book chapter, 2021

Abstract

An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.

algorithms, data protection, GDPR, Personalisation, Price discrimination, Privacy

Bibtex

@article{Poort2021,
  title = {Personalised pricing: The demise of the fixed price?},
  author = {Poort, J. and Zuiderveen Borgesius, F.},
  url = {https://www.ivir.nl/publicaties/download/The-Demise-of-the-Fixed-Price.pdf},
  year = {2021},
  date = {2021-03-04},
  abstract = {An online seller or platform is technically able to offer every consumer a different price for the same product, based on information it has about the customers. Such online price discrimination exacerbates concerns regarding the fairness and morality of price discrimination, and the possible need for regulation. In this chapter, we discuss the underlying basis of price discrimination in economic theory, and its popular perception. Our surveys show that consumers are critical and suspicious of online price discrimination. A majority consider it unacceptable and unfair, and are in favour of a ban. When stores apply online price discrimination, most consumers think they should be informed about it. We argue that the General Data Protection Regulation (GDPR) applies to the most controversial forms of online price discrimination, and not only requires companies to disclose their use of price discrimination, but also requires companies to ask customers for their prior consent. Industry practice, however, does not show any adoption of these two principles.},
  keywords = {algorithms, data protection, GDPR, Personalisation, Price discrimination, Privacy},
}
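The chapter analyses personalised pricing in economic and legal terms and contains no code. Purely as an illustration of the mechanism the abstract describes, the minimal Python sketch below quotes a different price for the same product per customer, based on signals the seller observes; the customer attributes, weights and base price are invented for this example and do not come from the publication.

# Illustrative sketch (not from the chapter): personalised pricing versus a
# fixed price. All attributes, weights and the base price are hypothetical.
from dataclasses import dataclass

@dataclass
class Customer:
    device_is_premium: bool        # e.g. inferred from the user agent
    prior_purchases: int           # purchase history held by the seller
    location_income_index: float   # 1.0 = average purchasing power in the area

FIXED_PRICE = 100.0  # what every customer would pay under uniform pricing

def personalised_price(c: Customer) -> float:
    """Estimate willingness to pay from observed signals and price accordingly."""
    estimate = FIXED_PRICE
    if c.device_is_premium:
        estimate *= 1.10                                   # premium device as a proxy for a higher budget
    estimate *= 1.0 + 0.02 * min(c.prior_purchases, 10)    # loyal buyers quoted more
    estimate *= c.location_income_index                    # richer area, higher price
    return round(estimate, 2)

if __name__ == "__main__":
    alice = Customer(device_is_premium=True, prior_purchases=8, location_income_index=1.2)
    bob = Customer(device_is_premium=False, prior_purchases=0, location_income_index=0.9)
    print(personalised_price(alice))  # pays more than the fixed price
    print(personalised_price(bob))    # pays less than the fixed price

Under uniform pricing both customers would pay FIXED_PRICE; here the seller's estimate of willingness to pay drives the quoted price, which is the kind of data-driven differentiation the chapter examines.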

Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem

Harambam, J., Helberger, N. & van Hoboken, J.
Philosophical Transactions of the Royal Society A, vol. 376, num: 2133, pp: 1-21, 2018

Abstract

The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.

access to information, algorithms, Artificial intelligence, news, persona, Personalisation, right to receive information, user agency

Bibtex

@article{Harambam2018b,
  title = {Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem},
  author = {Harambam, J. and Helberger, N. and van Hoboken, J.},
  url = {http://rsta.royalsocietypublishing.org/content/roypta/376/2133/20180088.full.pdf},
  doi = {10.1098/rsta.2018.0088},
  year = {2018},
  date = {2018-11-23},
  journal = {Philosophical Transactions of the Royal Society A},
  volume = {376},
  number = {2133},
  pages = {1-21},
  abstract = {The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.},
  keywords = {access to information, algorithms, Artificial intelligence, news, persona, Personalisation, right to receive information, user agency},
}
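The article is conceptual and does not ship an implementation. As a toy interpretation of "algorithmic recommender personae", the sketch below lets a user pick a weighting profile that re-ranks the same pool of news items; the persona names, item features and weights are all hypothetical and only illustrate the idea of exposing the recommender's logic as a user-selectable choice.

# Illustrative sketch (not from the article): recommender personae as
# user-selectable weighting profiles over the same pool of news items.
from typing import Dict, List

# Each news item is scored on a few editorial features (values are made up).
ITEMS: List[Dict] = [
    {"title": "Council approves new tram line", "local": 0.9, "depth": 0.4, "novelty": 0.3},
    {"title": "Explainer: how the pension reform works", "local": 0.2, "depth": 0.9, "novelty": 0.5},
    {"title": "Celebrity couple announces split", "local": 0.1, "depth": 0.1, "novelty": 0.9},
]

# Personae expose the recommender's weighting as something the user can choose.
PERSONAE: Dict[str, Dict[str, float]] = {
    "local_news_fan": {"local": 0.7, "depth": 0.2, "novelty": 0.1},
    "background_reader": {"local": 0.1, "depth": 0.8, "novelty": 0.1},
    "surprise_me": {"local": 0.2, "depth": 0.2, "novelty": 0.6},
}

def recommend(persona: str, k: int = 2) -> List[str]:
    """Re-rank the item pool with the weights of the chosen persona."""
    weights = PERSONAE[persona]
    scored = sorted(
        ITEMS,
        key=lambda item: sum(weights[f] * item[f] for f in weights),
        reverse=True,
    )
    return [item["title"] for item in scored[:k]]

if __name__ == "__main__":
    print(recommend("local_news_fan"))
    print(recommend("background_reader"))

Switching persona changes the ranking without any opaque profiling of the user, which is the kind of user control over curation the authors argue for.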

Shrinking core? Exploring the differential agenda setting power of traditional and personalized news

Möller, J., Helberger, N., Trilling, D., Irion, K. & Vreese, C.H. de
info, vol. 18, num: 6, pp: 26-41, 2016

Abstract

A shared issue agenda provides democracies with a set of topics that structure the public debate. The advent of personalized news media that use smart algorithms to tailor the news offer to the user challenges the established way of setting the agenda of such a common core of issues. This paper tests the effects of personalized news use on perceived importance of these issues in the common core. In particular we study whether personalized news use leads to a concentration at the top of the issue agenda or to a more diverse issue agenda with a long tail of topics. Based on a cross-sectional survey of a representative population sample (N=1556), we find that personalized news use does not lead to a small common core in which few topics are discussed extensively, yet there is a relationship between personalized news use and a preference for less discussed topics. This is a result of a specific user profile of personalized news users: younger, more educated news users are more interested in topics at the fringes of the common core and also make more use of personalized news offers. The results are discussed in the light of media diversity and recent advances in public sphere research.

common core, fragmentation, Media law, media law & policy, Personalisation, survey

Bibtex

@article{Moeller2016,
  title = {Shrinking core? Exploring the differential agenda setting power of traditional and personalized news},
  author = {Möller, J. and Helberger, N. and Trilling, D. and Irion, K. and Vreese, C.H. de},
  url = {http://www.emeraldinsight.com/doi/pdfplus/10.1108/info-05-2016-0020},
  doi = {10.1108/info-05-2016-0020},
  year = {2016},
  date = {2016-09-27},
  journal = {info},
  volume = {18},
  number = {6},
  pages = {26-41},
  abstract = {A shared issue agenda provides democracies with a set of topics that structure the public debate. The advent of personalized news media that use smart algorithms to tailor the news offer to the user challenges the established way of setting the agenda of such a common core of issues. This paper tests the effects of personalized news use on perceived importance of these issues in the common core. In particular we study whether personalized news use leads to a concentration at the top of the issue agenda or to a more diverse issue agenda with a long tail of topics. Based on a cross-sectional survey of a representative population sample (N=1556), we find that personalized news use does not lead to a small common core in which few topics are discussed extensively, yet there is a relationship between personalized news use and a preference for less discussed topics. This is a result of a specific user profile of personalized news users: younger, more educated news users are more interested in topics at the fringes of the common core and also make more use of personalized news offers. The results are discussed in the light of media diversity and recent advances in public sphere research.},
  keywords = {common core, fragmentation, Media law, media law & policy, Personalisation, survey},
}
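The paper reports survey results rather than code, but the contrast it draws between "a concentration at the top of the issue agenda" and "a more diverse issue agenda with a long tail of topics" can be made concrete with a small calculation. The Python sketch below computes the share of attention captured by the most-mentioned issues for two invented topic distributions; the numbers are hypothetical and are not the paper's data.

# Illustrative sketch (not from the paper): how concentrated is an issue agenda?
from collections import Counter

def top_share(topic_mentions: Counter, k: int = 3) -> float:
    """Fraction of all mentions captured by the k most-mentioned issues."""
    total = sum(topic_mentions.values())
    top_k = sum(count for _, count in topic_mentions.most_common(k))
    return top_k / total

# A concentrated agenda: a few issues dominate public attention.
concentrated = Counter({"economy": 50, "migration": 30, "health": 15, "culture": 3, "sport": 2})
# A long-tail agenda: attention spread over many smaller issues.
long_tail = Counter({"economy": 18, "migration": 15, "health": 14, "culture": 13,
                     "sport": 12, "housing": 10, "climate": 9, "privacy": 9})

if __name__ == "__main__":
    print(round(top_share(concentrated), 2))  # close to 1.0: small common core
    print(round(top_share(long_tail), 2))     # noticeably lower: long tail of topics

A high top-k share corresponds to the "shrinking core" scenario; the paper's finding is that personalised news use is associated with the long-tail pattern rather than with a smaller common core.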