Reading Between the Lines and the Numbers: An Analysis of the First NetzDG Reports

Amélie Pia Heldt's article "Reading Between the Lines and the Numbers: An Analysis of the First NetzDG Reports", published in Internet Policy Review, examines the activity reports that platforms receiving more than 100 complaints per year must publish under the NetzDG. The article is available online.

Abstract

Approaches to regulating social media platforms and the way they moderate content have been the subject of ongoing debate within legal and social scholarship for some time now. European policy makers have been demanding faster and more effective responses from the various social media platforms on how they deal with the dissemination of hate speech and disinformation. After a failed attempt to push social media platforms to self-regulate, Germany adopted a law called the Network Enforcement Act (NetzDG), which forces platforms to ensure that “obviously unlawful content” is deleted within 24 hours. It also obliges all platforms that receive more than 100 complaints about unlawful content per calendar year to publish bi-annual reports on their activities. This provision is designed to clarify how content is moderated and complaints are handled on social networks. The initial reports published after the NetzDG came into force reveal the law’s weak points, most notably the reports’ low informative value. When it comes to important takeaways regarding new regulation against hate speech and more channelled content moderation, the reports do not live up to the expectations of German lawmakers. This paper analyses the legislative reasoning behind the reporting obligation, the main outcomes of the reports from the major social networks (Facebook, YouTube, and Twitter), and why the reports are unsuitable to serve as grounds for further development of the NetzDG or any similar regulation.
