Moderating online child sexual abuse material (CSAM): Does self-regulation work, or is greater state regulation needed?
Bleakley, Paul (ORCID: https://orcid.org/0000-0002-2512-4072), Martellozzo, Elena (ORCID: https://orcid.org/0000-0002-1249-7611), Spence, Ruth (ORCID: https://orcid.org/0000-0002-6197-9975) and DeMarco, Jeffrey (ORCID: https://orcid.org/0000-0002-7160-2100) (2023) Moderating online child sexual abuse material (CSAM): Does self-regulation work, or is greater state regulation needed? European Journal of Criminology. ISSN 1477-3708 [Article] (Accepted/In press)
Abstract
Social media platforms serve as crucial public forums in the Internet age, connecting users around the world through a decentralised cyberspace. These platforms host high volumes of content and, as such, the content moderators (CMs) employed to safeguard users against harmful content such as child sexual abuse material and gore play a critical role. Yet despite how essential CMs are to the social media landscape, their work as "first responders" is complicated by legal and systemic debates over whether policing cyberspace should be left to the self-regulation of tech companies, or whether greater state regulation is required. In this scoping review, the major debates in this area are identified and evaluated. These include the issue of territorial jurisdiction and how it obstructs traditional policing online; concerns over free speech and privacy if CMs are given greater powers; debates over whether tech companies should be held legally liable for user-generated content; and the mental and professional impacts on the CMs who now operate as the new frontline against harmful, often traumatic, material shared on social media. In outlining these issues, our objective is to highlight areas requiring further attention in order to best support CMs and to enhance responses to harmful online content.
Item Type: | Article
---|---
Keywords (uncontrolled): | content moderation; social media; online harms; child sexual abuse material; cybercrime; technology industry
Research Areas: | A. > School of Science and Technology > Psychology > Centre for Abuse and Trauma Studies (CATS)
Item ID: | 38006
Notes on copyright: | The author accepted manuscript is made available in this repository in accordance with the publisher's (SAGE) self-archiving policy for an Institutional Repository: https://uk.sagepub.com/en-gb/eur/posting-to-an-institutional-repository-green-open-access
Depositing User: | Ruth Spence
Date Deposited: | 24 May 2023 16:03
Last Modified: | 15 Jun 2023 12:36
URI: | https://eprints.mdx.ac.uk/id/eprint/38006