Favale, M., 2022. Robots as the new judges: Copyright, Hate Speech, and Platforms. European Intellectual Property Review, 44 (8), 461-471.
Full text available as:
PDF: Robots as The New Judges-EIPR.pdf - Accepted Version, available under a Creative Commons Attribution Non-Commercial licence (443kB).
Copyright to original material in this document is with the original owner(s). Access to this content through BURO is granted on condition that you use it only for research, scholarly or other non-commercial purposes. If you wish to use it for any other purposes, you must contact BU via BURO@bournemouth.ac.uk. Any third party copyright material in this document remains the property of its respective owner(s). BU grants no licence for further use of that third party material.
Official URL: https://uk.westlaw.com/Document/ID141D8E0062A11EDB...
Abstract
On 16 October 2020, a middle-school teacher, Samuel Paty, was beheaded by a terrorist who would not have known of his existence but for a number of videos posted on social media, against which Mr Paty had filed a defamation complaint with the local police. 1 Yet a law against the publication of heinous content online had been approved in France on 13 May of the same year. 2 In June, however, the Constitutional Council had struck down the article requiring the incriminated content to be taken down within 24 hours, on the basis that it would unduly encroach on freedom of expression. 3 Heated political debate was sparked by this decision in the light of the recent gruesome events. The topic of the liability of internet intermediaries has never been so contentious. Internet platforms have enjoyed immunity (known as the Safe Harbour) both in European Union (EU) law and overseas. More recently (2019), a new Copyright Directive 4 entered into force. It was implemented by Member States in June 2021. This piece of legislation prompted criticism because it imposes enhanced responsibility on internet platforms that do not remove illegal content from their services quickly enough. 5 But how quick must an action be to count as "quickly enough" (i.e. expeditiously)? ISPs (Internet Service Providers) have argued that, as "mere" intermediaries, they could not control the content their subscribers published online and therefore could not be held responsible for the unlawful activities taking place on their platforms. Rights holders have argued in response that intermediaries often benefit from infringing activities; their provision of services cannot therefore be considered wholly neutral, and intermediaries should be held accountable. Currently, two new pieces of legislation are under way to horizontally streamline platforms' filtering duties (the Digital Services Act and the Digital Markets Act). 6 However, much remains to be done to define the contours of these new norms, notably as regards different types of illegal content and whether they deserve different treatment. This paper discusses filtering obligations (robots as opposed to judges) in relation to copyright infringement versus defamation/hate speech. It argues that it is not legally viable to apply the same norms to such different areas of law, as the consequences of infringing these norms are incomparable.
Item Type: Article
ISSN: 0142-0461
Uncontrolled Keywords: Accountability; Algorithms; Copyright; Defamation; EU law; Hate speech; Online infringement; Online intermediaries
Group: Faculty of Media & Communication
ID Code: 37685
Deposited By: Symplectic RT2
Deposited On: 20 Oct 2022 14:32
Last Modified: 25 Jul 2023 01:08