CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work external link

Kluwer Copyright Blog, 2020

Art. 17 CDSM Directive, Auteursrecht, filtering, frontpage

Bibtex

@article{Keller2020d,
  title = {CJEU hearing in the Polish challenge to Article 17: Not even the supporters of the provision agree on how it should work},
  author = {Keller, P.},
  url = {http://copyrightblog.kluweriplaw.com/2020/11/11/cjeu-hearing-in-the-polish-challenge-to-article-17-not-even-the-supporters-of-the-provision-agree-on-how-it-should-work/},
  year = {2020},
  date = {2020-11-11},
  journal = {Kluwer Copyright Blog},
  keywords = {Art. 17 CDSM Directive, Auteursrecht, filtering, frontpage},
}

The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market external link

Abstract

EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). 
With regard to the Digital Services Act (‘DSA’), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.

algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content

Bibtex

@report{Senftleben2020e,
  title = {The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market},
  author = {Senftleben, M. and Angelopoulos, C.},
  url = {https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3717022},
  year = {2020},
  date = {2020-10-29},
  abstract = {EU law provides explicitly that intermediaries may not be obliged to monitor their service in a general manner in order to detect and prevent the illegal activity of their users. However, a misunderstanding of the difference between monitoring specific content and monitoring FOR specific content is a recurrent theme in the debate on intermediary liability and a central driver of the controversy surrounding it. Rightly understood, a prohibited general monitoring obligation arises whenever content – no matter how specifically it is defined – must be identified among the totality of the content on a platform. The moment platform content must be screened in its entirety, the monitoring obligation acquires an excessive, general nature. Against this background, a content moderation duty can only be deemed permissible if it is specific in respect of both the protected subject matter and potential infringers. This requirement of 'double specificity' is of particular importance because it prevents encroachments upon fundamental rights. The jurisprudence of the Court of Justice of the European Union has shed light on the anchorage of the general monitoring ban in primary EU law, in particular the right to the protection of personal data, the freedom of expression and information, the freedom to conduct a business, and the free movement of goods and services in the internal market. Due to their higher rank in the norm hierarchy, these legal guarantees constitute common ground for the application of the general monitoring prohibition in secondary EU legislation, namely Article 15(1) of the E-Commerce Directive ('ECD') and Article 17(8) of the Directive on Copyright in the Digital Single Market ('CDSMD'). With regard to the Digital Services Act ('DSA'), this result of the analysis implies that any further manifestation of the general monitoring ban in the DSA would have to be construed and applied – in the light of applicable CJEU case law – as a safeguard against encroachments upon the aforementioned fundamental rights and freedoms. If the final text of the DSA does not contain a reiteration of the prohibition of general monitoring obligations known from Article 15(1) ECD and Article 17(8) CDSMD, the regulation of internet service provider liability, duties of care and injunctions would still have to avoid inroads into the aforementioned fundamental rights and freedoms and observe the principle of proportionality. The double specificity requirement plays a central role in this respect.},
  keywords = {algorithmic enforcement, Auteursrecht, censorship, Content moderation, Copyright, defamation, Digital services act, filtering, Freedom of expression, frontpage, general monitoring, hosting service, injunctive relief, intermediary liability, notice and stay down, notice and take down, safe harbour, trade mark, user-generated content},
}

My Friends, Editors, Algorithms, and I: Examining audience attitudes to news selection external link

Thurman, N., Möller, J., Helberger, N. & Trilling, D.
Digital Journalism, vol. 2018, 2018

Abstract

Prompted by the ongoing development of content personalization by social networks and mainstream news brands, and recent debates about balancing algorithmic and editorial selection, this study explores what audiences think about news selection mechanisms and why. Analysing data from a 26-country survey (N = 53,314), we report the extent to which audiences believe story selection by editors and story selection by algorithms are good ways to get news online and, using multi-level models, explore the relationships that exist between individuals’ characteristics and those beliefs. The results show that, collectively, audiences believe algorithmic selection guided by a user’s past consumption behaviour is a better way to get news than editorial curation. There are, however, significant variations in these beliefs at the individual level. Age, trust in news, concerns about privacy, mobile news access, paying for news, and six other variables had effects. Our results are partly in line with current general theory on algorithmic appreciation, but diverge in our findings on the relative appreciation of algorithms and experts, and in how the appreciation of algorithms can differ according to the data that drive them. We believe this divergence is partly due to our study’s focus on news, showing algorithmic appreciation has context-specific characteristics.

algoritmes, curation, filtering, frontpage, gatekeeping, Journalistiek, Mediarecht, personalization, recommender systems, user tracking

Bibtex

@article{Thurman2018,
  title = {My Friends, Editors, Algorithms, and I: Examining audience attitudes to news selection},
  author = {Thurman, N. and Möller, J. and Helberger, N. and Trilling, D.},
  url = {https://doi.org/10.1080/21670811.2018.1493936},
  doi = {10.1080/21670811.2018.1493936},
  year = {2018},
  date = {2018-10-19},
  journal = {Digital Journalism},
  volume = {2018},
  abstract = {Prompted by the ongoing development of content personalization by social networks and mainstream news brands, and recent debates about balancing algorithmic and editorial selection, this study explores what audiences think about news selection mechanisms and why. Analysing data from a 26-country survey (N = 53,314), we report the extent to which audiences believe story selection by editors and story selection by algorithms are good ways to get news online and, using multi-level models, explore the relationships that exist between individuals’ characteristics and those beliefs. The results show that, collectively, audiences believe algorithmic selection guided by a user’s past consumption behaviour is a better way to get news than editorial curation. There are, however, significant variations in these beliefs at the individual level. Age, trust in news, concerns about privacy, mobile news access, paying for news, and six other variables had effects. Our results are partly in line with current general theory on algorithmic appreciation, but diverge in our findings on the relative appreciation of algorithms and experts, and in how the appreciation of algorithms can differ according to the data that drive them. We believe this divergence is partly due to our study’s focus on news, showing algorithmic appreciation has context-specific characteristics.},
  keywords = {algoritmes, curation, filtering, frontpage, gatekeeping, Journalistiek, Mediarecht, personalization, recommender systems, user tracking},
}

Exposure diversity as a design principle for recommender systems external link

Helberger, N., Karppinen, K. & D'Acunto, L.
Information, Communication and Society, vol. 2018, num: 2, 2017

Abstract

Personalized recommendations in search engines, social media and also in more traditional media increasingly raise concerns over potentially negative consequences for diversity and the quality of public discourse. The algorithmic filtering and adaptation of online content to personal preferences and interests is often associated with a decrease in the diversity of information to which users are exposed. Notwithstanding the question of whether these claims are correct or not, this article discusses whether and how recommendations can also be designed to stimulate more diverse exposure to information and to break potential ‘filter bubbles’ rather than create them. Combining insights from democratic theory, computer science and law, the article makes suggestions for design principles and explores the potential and possible limits of ‘diversity sensitive design’.

autonomy, exposure diversity, filter bubbles, filtering, frontpage, information diversity, media law, nudging, recommender systems, search engines, Social media

Bibtex

@article{Helberger2017,
  title = {Exposure diversity as a design principle for recommender systems},
  author = {Helberger, N. and Karppinen, K. and D'Acunto, L.},
  url = {https://www.ivir.nl/publicaties/download/ICS_2016.pdf},
  doi = {10.1080/1369118X.2016.1271900},
  year = {2017},
  date = {2017-01-19},
  journal = {Information, Communication and Society},
  volume = {2018},
  number = {2},
  abstract = {Personalized recommendations in search engines, social media and also in more traditional media increasingly raise concerns over potentially negative consequences for diversity and the quality of public discourse. The algorithmic filtering and adaptation of online content to personal preferences and interests is often associated with a decrease in the diversity of information to which users are exposed. Notwithstanding the question of whether these claims are correct or not, this article discusses whether and how recommendations can also be designed to stimulate more diverse exposure to information and to break potential ‘filter bubbles’ rather than create them. Combining insights from democratic theory, computer science and law, the article makes suggestions for design principles and explores the potential and possible limits of ‘diversity sensitive design’.},
  keywords = {autonomy, exposure diversity, filter bubbles, filtering, frontpage, information diversity, media law, nudging, recommender systems, search engines, Social media},
}