A Primer and FAQ on Copyright Law and Generative AI for News Media

Quintais, J. & Diakopoulos, N.
2023

Artificial intelligence, Copyright, Media law, news

Bibtex

@online{nokey,
  title = {A Primer and FAQ on Copyright Law and Generative AI for News Media},
  author = {Quintais, J. and Diakopoulos, N.},
  url = {https://generative-ai-newsroom.com/a-primer-and-faq-on-copyright-law-and-generative-ai-for-news-media-f1349f514883},
  year = {2023},
  date = {2023-04-26},
  keywords = {Artificial intelligence, Copyright, Media law, news},
}

Dealing with Opinion Power in the Platform World: Why We Really Have to Rethink Media Concentration Law

Seipp, T., Helberger, N., Vreese, C.H. de & Ausloos, J.
Digital Journalism, 2023

Abstract

The platformised news environment affects audiences, challenges the news media’s role, and transforms the media ecosystem. Digital platform companies influence opinion formation and hence wield “opinion power,” a normatively and constitutionally rooted notion that captures the core of media power in democracy and substantiates why that power must be distributed. Media concentration law is the traditional tool to prevent predominant opinion power from emerging but is, in its current form, not applicable to the platform context. We demonstrate how the nature of opinion power is changing and shifting from news media to platforms and distinguish three levels of opinion power: (1) the individual citizen, (2) the institutional newsroom and (3) the media ecosystem. The reconceptualization at the three levels provides a framework to develop future (non-)regulatory responses that address (1) the shifting influence over individual news consumption and exposure, (2) the changing power dynamics within automated, datafied and platform-dependent newsrooms, and (3) the systemic power of platforms and structural dependencies in the media ecosystem. We demonstrate that as the nature of opinion power is changing, so must the tools of control.

Media law, news, Platforms

Bibtex

@article{nokey,
  title = {Dealing with Opinion Power in the Platform World: Why We Really Have to Rethink Media Concentration Law},
  author = {Seipp, T. and Helberger, N. and Vreese, C.H. de and Ausloos, J.},
  url = {https://www.tandfonline.com/doi/full/10.1080/21670811.2022.2161924},
  doi = {10.1080/21670811.2022.2161924},
  year = {2023},
  date = {2023-01-03},
  journal = {Digital Journalism},
  abstract = {The platformised news environment affects audiences, challenges the news media’s role, and transforms the media ecosystem. Digital platform companies influence opinion formation and hence wield “opinion power,” a normatively and constitutionally rooted notion that captures the core of media power in democracy and substantiates why that power must be distributed. Media concentration law is the traditional tool to prevent predominant opinion power from emerging but is, in its current form, not applicable to the platform context. We demonstrate how the nature of opinion power is changing and shifting from news media to platforms and distinguish three levels of opinion power: (1) the individual citizen, (2) the institutional newsroom and (3) the media ecosystem. The reconceptualization at the three levels provides a framework to develop future (non-)regulatory responses that address (1) the shifting influence over individual news consumption and exposure, (2) the changing power dynamics within automated, datafied and platform-dependent newsrooms, and (3) the systemic power of platforms and structural dependencies in the media ecosystem. We demonstrate that as the nature of opinion power is changing, so must the tools of control.},
  keywords = {Media law, news, Platforms},
}

Interested in diversity: The role of user attitudes, algorithmic feedback loops, and policy in news personalization

Bodó, B., Helberger, N., Eskens, S. & Möller, J.
Digital Journalism, vol. 7, num: 2, pp: 206-229, 2019

Abstract

Using survey evidence from the Netherlands, we explore the factors that influence news readers’ attitudes toward news personalization. We show that the value of personalization depends on commonly overlooked factors, such as concerns about a shared news sphere, and the diversity of recommendations. However, these expectations are not universal. Younger, less educated users are more exposed to personalized news and show little concern about diverse news recommendations. Quality news organizations that pursue reader loyalty and trust are incentivized to implement personalization algorithms that aim for diversity and high quality recommendations. However, some users are in danger of being left out of this positive feedback loop. We make specific policy suggestions regarding how to solve that issue.
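
To make the feedback loop in this abstract concrete, the sketch below is a toy simulation, not the paper's model: all parameters, thresholds and update rules are illustrative assumptions. Readers who trust a quality outlet receive diversity-aware personalization, which in the toy model further builds trust, while less engaged readers gradually drop out of that loop.

    # Toy illustration of the personalization feedback loop described above.
    # All parameters and update rules are editorial assumptions, not the paper's model.
    import random

    random.seed(42)

    class Reader:
        def __init__(self, trust):
            self.trust = trust              # loyalty/trust in the quality outlet, 0..1
            self.diversity_exposure = 0.5   # diversity of the news diet, 0..1

    def step(reader):
        # Loyal readers get the outlet's diversity-aware personalization;
        # less engaged readers fall back on generic, engagement-driven feeds.
        if reader.trust > 0.5:
            reader.diversity_exposure = min(1.0, reader.diversity_exposure + 0.05)
        else:
            reader.diversity_exposure = max(0.0, reader.diversity_exposure - 0.05)
        # Assumed link: more diverse, higher-quality recommendations build trust.
        reader.trust = min(1.0, max(0.0, reader.trust + 0.1 * (reader.diversity_exposure - 0.5)))

    readers = [Reader(trust=random.random()) for _ in range(1000)]
    for _ in range(50):
        for r in readers:
            step(r)

    left_out = sum(r.diversity_exposure < 0.2 for r in readers)
    print(f"Readers pushed out of the diverse-news loop: {left_out}/{len(readers)}")

Running the toy shows the population splitting into a loyal, diverse-diet group and a group left outside the loop, which is the dynamic the paper's policy suggestions address.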

frontpage, Media law, news, personalization, survey, the netherlands, user attitudes

Bibtex

@article{Bodó2019,
  title = {Interested in diversity: The role of user attitudes, algorithmic feedback loops, and policy in news personalization},
  author = {Bodó, B. and Helberger, N. and Eskens, S. and Möller, J.},
  url = {https://www.tandfonline.com/doi/full/10.1080/21670811.2018.1521292},
  doi = {10.1080/21670811.2018.1521292},
  year = {2019},
  date = {2019-01-08},
  journal = {Digital Journalism},
  volume = {7},
  number = {2},
  pages = {206-229},
  abstract = {Using survey evidence from the Netherlands, we explore the factors that influence news readers’ attitudes toward news personalization. We show that the value of personalization depends on commonly overlooked factors, such as concerns about a shared news sphere, and the diversity of recommendations. However, these expectations are not universal. Younger, less educated users are more exposed to personalized news and show little concern about diverse news recommendations. Quality news organizations that pursue reader loyalty and trust are incentivized to implement personalization algorithms that aim for diversity and high quality recommendations. However, some users are in danger of being left out of this positive feedback loop. We make specific policy suggestions regarding how to solve that issue.},
  keywords = {frontpage, Media law, news, personalization, survey, the netherlands, user attitudes},
}

Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem

Harambam, J., Helberger, N. & van Hoboken, J.
Philosophical Transactions of the Royal Society A, vol. 376, num: 2133, pp: 1-21, 2018

Abstract

The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.
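
The article's notion of algorithmic recommender personae can be illustrated with a minimal sketch: each persona is a user-selectable set of weights that re-ranks the same pool of articles. The persona names, weights and scoring function below are assumptions for illustration only; the article does not prescribe an implementation.

    # Minimal sketch of "recommender personae": user-selectable profiles that
    # re-weight a news recommender's scoring. The persona names, weights and
    # scoring function are illustrative assumptions, not the paper's design.
    from dataclasses import dataclass

    @dataclass
    class Article:
        title: str
        topic: str
        popularity: float   # 0..1, e.g. click-through rate
        novelty: float      # 0..1, distance from the user's reading history

    # Each persona expresses a "voice": how the reader wants curation to behave.
    PERSONAE = {
        "junk-food junkie": {"popularity": 1.0, "novelty": 0.0},
        "diversity seeker": {"popularity": 0.2, "novelty": 1.0},
        "balanced":         {"popularity": 0.5, "novelty": 0.5},
    }

    def recommend(articles, persona, k=3):
        w = PERSONAE[persona]
        scored = sorted(
            articles,
            key=lambda a: w["popularity"] * a.popularity + w["novelty"] * a.novelty,
            reverse=True,
        )
        return scored[:k]

    articles = [
        Article("Transfer rumours", "sports", popularity=0.9, novelty=0.1),
        Article("EU AI Act explained", "politics", popularity=0.4, novelty=0.8),
        Article("Local housing report", "economy", popularity=0.3, novelty=0.9),
        Article("Celebrity interview", "culture", popularity=0.8, novelty=0.2),
    ]

    for persona in PERSONAE:
        picks = [a.title for a in recommend(articles, persona, k=2)]
        print(f"{persona}: {picks}")

Switching personae changes the ranking without any transparency into the underlying model, which is the point: voice is exercised through adjustment rather than explanation.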

access to information, algorithms, Artificial intelligence, frontpage, news, persona, Personalisation, right to receive information, user agency

Bibtex

@article{Harambam2018b,
  title = {Democratizing algorithmic news recommenders: how to materialize voice in a technologically saturated media ecosystem},
  author = {Harambam, J. and Helberger, N. and van Hoboken, J.},
  url = {http://rsta.royalsocietypublishing.org/content/roypta/376/2133/20180088.full.pdf},
  doi = {10.1098/rsta.2018.0088},
  year = {2018},
  date = {2018-11-23},
  journal = {Philosophical Transactions of the Royal Society A},
  volume = {376},
  number = {2133},
  pages = {1-21},
  abstract = {The deployment of various forms of AI, most notably of machine learning algorithms, radically transforms many domains of social life. In this paper we focus on the news industry, where different algorithms are used to customize news offerings to increasingly specific audience preferences. While this personalization of news enables media organizations to be more receptive to their audience, it can be questioned whether current deployments of algorithmic news recommenders (ANR) live up to their emancipatory promise. Like in various other domains, people have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes. Instead of going down the intricate avenue of trying to make ANR more transparent, we explore in this article ways to give people more influence over the information news recommendation algorithms provide by thinking about and enabling possibilities to express voice. After differentiating four ideal typical modalities of expressing voice (alternation, awareness, adjustment and obfuscation) which are illustrated with currently existing empirical examples, we present and argue for algorithmic recommender personae as a way for people to take more control over the algorithms that curate people's news provision.},
  keywords = {access to information, algorithms, Artificial intelligence, frontpage, news, persona, Personalisation, right to receive information, user agency},
}

Do not blame it on the algorithm: an empirical assessment of multiple recommender systems and their impact on content diversity

Möller, J., Trilling, D., Helberger, N. & Es, B. van
Information, Communication & Society, 2018

Abstract

In the debate about filter bubbles caused by algorithmic news recommendation, the conceptualization of the two core concepts in this debate, diversity and algorithms, has received little attention in social scientific research. This paper examines the effect of multiple recommender systems on different diversity dimensions. To this end, it maps different values that diversity can serve, and a respective set of criteria that characterizes a diverse information offer in this particular conception of diversity. We make use of a data set of simulated article recommendations based on actual content of one of the major Dutch broadsheet newspapers and its users (N=21,973 articles, N=500 users). We find that all of the recommendation logics under study proved to lead to a rather diverse set of recommendations that are on par with human editors and that basing recommendations on user histories can substantially increase topic diversity within a recommendation set.
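
One simple way to operationalize the kind of topic-diversity comparison described here is sketched below. The metric (normalized Shannon entropy over topic labels) and the example recommendation sets are illustrative assumptions, not the paper's exact measures or data.

    # A minimal sketch of one way to quantify topic diversity in a recommendation
    # set: normalized Shannon entropy over the topic distribution. The paper works
    # with several diversity dimensions and real newspaper data; the metric choice
    # and example data here are editorial assumptions.
    import math
    from collections import Counter

    def topic_diversity(topics):
        """Normalized Shannon entropy of a list of topic labels (0 = a single topic,
        1 = perfectly even spread over the topics that occur)."""
        counts = Counter(topics)
        n = len(topics)
        if len(counts) <= 1:
            return 0.0
        entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
        return entropy / math.log(len(counts))

    # Hypothetical recommendation sets (topic labels of the recommended articles).
    editor_picks   = ["politics", "economy", "culture", "sports", "science"]
    popularity_rec = ["sports", "sports", "sports", "culture", "sports"]
    history_based  = ["politics", "economy", "politics", "science", "culture"]

    for name, rec_set in [("editors", editor_picks),
                          ("most-read", popularity_rec),
                          ("user-history", history_based)]:
        print(f"{name:>12}: topic diversity = {topic_diversity(rec_set):.2f}")

Comparing such scores across recommendation logics and against an editorial baseline is the basic shape of the analysis the abstract describes.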

algorithms, automated content classification, diversity metrics, filter bubbles, frontpage, news, recommender systems

Bibtex

@article{Möller2018,
  title = {Do not blame it on the algorithm: an empirical assessment of multiple recommender systems and their impact on content diversity},
  author = {Möller, J. and Trilling, D. and Helberger, N. and Es, B. van},
  url = {https://www.ivir.nl/publicaties/download/ICS_2018.pdf},
  doi = {10.1080/1369118X.2018.1444076},
  year = {2018},
  date = {2018-03-08},
  journal = {Information, Communication & Society},
  abstract = {In the debate about filter bubbles caused by algorithmic news recommendation, the conceptualization of the two core concepts in this debate, diversity and algorithms, has received little attention in social scientific research. This paper examines the effect of multiple recommender systems on different diversity dimensions. To this end, it maps different values that diversity can serve, and a respective set of criteria that characterizes a diverse information offer in this particular conception of diversity. We make use of a data set of simulated article recommendations based on actual content of one of the major Dutch broadsheet newspapers and its users (N=21,973 articles, N=500 users). We find that all of the recommendation logics under study proved to lead to a rather diverse set of recommendations that are on par with human editors and that basing recommendations on user histories can substantially increase topic diversity within a recommendation set.},
  keywords = {algorithms, automated content classification, diversity metrics, filter bubbles, frontpage, news, recommender systems},
}