The digital space is shaped by private companies and state regulation. Users are exposed to these dynamics directly and often without protection: business decisions by digital companies and government measures determine the space in which users spend time online - without users being able to influence them. This distribution of power creates a problematic imbalance.
Users' rights to freedom of expression and information, academic freedom, equal treatment and democratic participation are not sufficiently protected. Journalists, researchers and civil society actors in particular can be restricted in their work as a result - a threat to a vibrant democracy. For marginalized groups, restrictions on platforms can also mean exclusion from social debates.
With the Center for User Rights, the GFF aims to sustainably strengthen, assert and enforce the rights of users. The goal is to correct the existing power imbalance between online platforms and their users and to consolidate the foundations of a democratic digital society.
The focus of the Center for User Rights
The Center for User Rights bundles the GFF's cases and projects on platform regulation and aims to strengthen and enforce user rights in various areas.
Supporting the implementation and enforcement of the Digital Services Act is the focus of the Center's work. In addition, we remain involved in the introduction of a Digital Violence Protection Act and continue our work in the area of copyright law.
The Digital Services Act
With the Digital Services Act (DSA), the European Union lays down standardized rules according to which online platforms must remove, restrict or reinstate content. The DSA also regulates how platforms must report on their content moderation measures. It confirms that European fundamental rights must be respected in the platforms' terms and conditions. The rights of researchers to access data from platform companies are strengthened and expanded. In addition, the DSA introduces special due diligence obligations for very large online platforms and search engines, such as obligations to analyse and minimize the systemic risks their services may pose to society.
Unlike the General Data Protection Regulation, the DSA does not rely solely on national regulatory authorities, but also gives the European Commission centralized powers to directly supervise large online platforms such as X, Meta and Google. Due to their reach, the services of these platforms can pose considerable systemic risks to democratic discourse and the fundamental rights of their users, and must therefore be regulated particularly strictly.
The DSA promotes cooperation between national enforcement bodies and also gives civil society organizations a comprehensive mandate to enforce users' rights. We want to use this mandate and defend the fundamental rights of users with strategic lawsuits. We will focus on four areas:
- More transparency and respect for fundamental rights in (automated) content moderation: platforms must operate more transparently and respect the fundamental rights of their users when moderating content, for example when deleting it.
- Access to research data: Platforms keep their own data under lock and key, making research into their influence on democratic discourse almost impossible. Platforms must release relevant data and thereby respect academic freedom.
- Enforcement of user rights in the design of platforms: Numerous provisions of the DSA aim to make online platforms and their practices and policies user-friendly and compliant with fundamental rights. For example, platforms must disclose how they recommend content to users. Other priorities include rules on online advertising and the proactive disclosure of data to law enforcement authorities.
- Protection against discrimination: When algorithms decide which content to delete or block, or select which content is shown to whom, the risk of discrimination is high. We use the new legal possibilities to counter these risks.
Digital protection against violence
Digital violence not only causes great suffering for those affected - it also threatens our democracy. A vibrant democracy needs communication spaces in which people can express their opinions without fear - including online. Otherwise, relevant voices will disappear and the diversity of opinion will be at risk.
Together with the Alfred Landecker Foundation, we have launched the Marie Munk Initiative - a project that defends fundamental rights in the digital space. As part of the Center for User Rights, the Marie Munk Initiative works to improve protection against digital violence. To this end, we have presented a draft Digital Violence Protection Act and are supporting the ongoing legislative process for such a law.
User rights within the framework of copyright law
Free communication is fundamental to a vibrant democracy and is closely linked to numerous civil liberties, in particular academic freedom and the freedoms of information, expression and art. Copyright law, which in the pre-digital age mainly concerned professional creatives and media companies, now frequently comes into conflict with this freedom of communication.
This has consequences for science and teaching, for activists, and also for authors themselves. As part of the Center for User Rights, the control © project defends user rights in the area of tension between freedom of communication and copyright.