
The Marie Munk Initiative - using the law against hatred on the internet

In May 2023, we published a draft for a Digital Violence Protection Act.

Hate and incitement on the internet are not only distressing for those affected - they also endanger our democracy. Only when citizens can express their opinions freely and without fear can a climate emerge in which a vibrant democracy thrives. With the Marie Munk Initiative, we are launching a project that defends fundamental rights in the digital space as well. The initiative is named after the Berlin judge Marie Munk; the project's ambassador is the pianist Igor Levit.

The first objective of the Marie Munk Initiative is to develop a draft Digital Violence Protection Act. It is intended to create a legal basis for courts to block accounts that disseminate criminally relevant content. Unlike in the past, courts would be able to block accounts without having to identify the person behind them. This approach requires neither a real-name policy nor data retention, and it also protects users' freedom of expression. The planned Digital Violence Protection Act is intended to serve as a blueprint for the new German government - and to shift responsibility away from private companies and back to the rule of law. In addition to the draft law, the initiative also promotes the expansion and development of counselling services for people affected by digital violence.


A study commissioned by the GFF and conducted with 1,000 representatively selected respondents shows how urgently protection against digital violence must be addressed: 67 percent of respondents said they had already encountered hate and incitement online. One in five had been insulted on the internet; among young women, one in four had themselves experienced digital violence. In addition, one in three young women feared that intimate pictures of her would be published online.

The need for effective measures and strategies is clear - and with it a mandate for the new federal government, which announced a Digital Violence Protection Act in its coalition agreement. However, our study also shows that only a few respondents trust the parties to develop effective and consistent measures against digital violence. At 13 percent, the SPD enjoys the most trust, while all other parties remain in single digits.

The study Digital Violence Protection, conducted by Kantar Public (2021), can be found here (in German):

Download overall study

Download summary


Past experience shows that social media platforms do not crack down on digital violence consistently enough. This view also prevails among the population, as our study shows: more than 60 percent of respondents say that social media platforms do not do enough to protect users from hatred. Twitter (82 percent) and Instagram (74 percent) score particularly poorly here - and are also the places where women in particular experience digital violence. At the same time, our study shows that respondents want a shift in responsibility. More than 70 percent are in favour of courts, not social media platforms, deciding whether accounts are blocked. Almost 90 percent also agree that courts should be able to block individual social media accounts in the event of legal violations - without having to identify the person behind them. Importantly, criminal proceedings against the responsible individuals could still be initiated as a next step. The approach of the Marie Munk Initiative is designed so that, as a priority, those affected can be helped quickly.


European law leaves the way clear for a Digital Violence Protection Act. This is confirmed by an expert opinion commissioned by the GFF from the Institute of European Media Law (EMR), authored by Prof. Dr. Mark Cole and Dr. Jörg Ukrow, LL.M. Eur. In it, the experts examine the leeway the EU's Digital Services Act (DSA) leaves to national legislators. The DSA allows member states to shape the civil liability of online platforms and merely provides a "formal framework" for doing so. In particular, EU law does not prevent national legislators from allowing court-ordered account suspensions. Such new measures can be justified by legitimate public interests. An accompanying regulation on national agents for service of process is also possible; it would help both victims of digital violence and users affected by arbitrary moderation decisions by the platforms.

At the same time, the GFF is publishing a position paper in which it formulates demands on the legislator based on the results of the expert opinion. The EMR report and the GFF position paper can be found here (in German):


In May 2023, we presented our draft for a Digital Violence Protection Act. We need:

  1. legal bases for claims that are future-proof and address specific problems,
  2. simplified procedures in which courts decide effectively and in accordance with the rule of law, and
  3. the possibility for civil society organisations to support victims in proceedings or to conduct proceedings on their own initiative.

The core of the draft is a legal basis that empowers judges to take the measures that are necessary and, in the individual case, proportionate to end digital violence and prevent further violations. These measures include, in particular, temporary or permanent account blocks. This is necessary above all in cases where platforms do not block accounts under their own terms of use even though the accounts' actions violate national law, and/or where platforms place their own (economic) interests above the fundamental rights of those affected.

According to the draft, in addition to affected persons themselves, civil society organisations that advise affected persons and combat digital violence can also file corresponding applications with the court and conduct the proceedings for the affected persons, but also on their own behalf.

It is important to us that these protective measures, such as account blocks, do not require identifying the account holder - something that is necessary only for claims against the individual perpetrators. In this way, digital violence can be ended quickly instead of valuable time being lost establishing the identity of those responsible. There is no need for a real-name requirement for social networks, nor for excessive disclosure obligations that would quickly undermine anonymity on the internet and are often accompanied by calls for new data storage obligations. Otherwise, the important fight against digital violence quickly becomes a backdoor to a new debate on data retention.

You can find our draft for a Digital Violence Protection Act and FAQ here (in German):


The key points for a Digital Violence Protection Act, which the Federal Ministry of Justice presented in April, focus in our view unnecessarily on extended claims for information - that is, primarily on revealing the identity of individual perpetrators. Although this makes it possible to enforce civil law claims such as injunctive relief and damages against these persons, such proceedings take a long time and offer no quick protection for those affected. The possibility of account blocking, which the ministry's paper fortunately also provides for, is in turn defined much too narrowly. Serious acts of digital violence such as incitement to hatred (Volksverhetzung) are excluded from its scope because no individual person is affected.

Our commentary on the Federal Government's key points (as of 4 May 2023) can be found here (in German):


Another central concern of the Marie Munk Initiative is the establishment and expansion of counselling structures for victims of digital violence. This project was also announced in the coalition agreement as a component of the law against digital violence, but it is not yet reflected in the BMJ's key points. Strengthening counselling services is not only relevant for victims; it is also important so that counselling structures can support victims in proceedings or conduct proceedings on their own initiative, as provided for in our draft law.

In order to expand counselling structures for victims of digital violence as comprehensively and as early as possible, we - together with our partner organisations from the F5 alliance (AlgorithmWatch, Gesellschaft für Freiheitsrechte, Open Knowledge Foundation Germany, Reporters Without Borders, Wikimedia Germany) - are involved in the current legislative process for the so-called Democracy Promotion Act (Act to Strengthen Measures to Promote Democracy, Prevent Extremism and Provide Political Education). So far, we have submitted two statements to the responsible ministries, in which we demand, among other things, that work against digital violence and corresponding counselling services for affected persons be included in the law as an independent subject eligible for funding. Personal digital violence has a direct impact on democratic participation; hate speech, disinformation and digital violence threaten our fundamental rights. A Democracy Promotion Act must therefore address the promotion of civil society engagement in these areas together with the promotion of counselling services.

Our statements on the Democracy Promotion Act can be found here (in German):

Defend fundamental rights.
Become a supporting member!