Online content moderation and the Dark Web: Policy responses to radicalizing hate speech and malicious content on the Darknet

Eric Jardine


De-listing, de-platforming, and account bans are just some of the increasingly common steps taken by major Internet companies to moderate their online content environments. Yet these steps are not without unintended effects. This paper proposes a surface-to-Dark Web content cycle. In this process, malicious content is initially posted on the surface Web and is then moderated by platforms. Moderated content does not necessarily disappear when major Internet platforms crack down; it simply shifts to the Dark Web. From the Dark Web, malicious informational content can then percolate back to the surface Web through three pathways. The implication of this cycle is that managing the online information environment requires careful attention to the whole system, not just the content hosted on surface Web platforms. Both government and private sector actors can more effectively manage the surface-to-Dark Web content cycle through a series of discrete practices and policies implemented at each stage of the wider process.


Keywords: Dark Web; Darknet; Radicalization; Content Moderation


© First Monday, 1995-2020. ISSN 1396-0466.