Digital platforms are a danger for democracy
November 7, 2025
In January 2025, Elon Musk conducted an interview on X with Alice Weidel, the leader of Germany's far-right AfD party, some regional branches of which are considered right-wing extremist by German intelligence services.
"Only the AfD can save Germany. End of story," he said in an undisguised interference by a powerful social network in Germany's election campaign.
In Romania in 2024, the far-right presidential candidate Calin Georgescu won the first round of the elections to the surprise of many: The political outsider had not participated in any TV debates and had not invested any money in his campaign. His success came mainly through the video platform TikTok; his videos were very prominent in the feeds of many Romanians.
Suspicions quickly arose that social bots (automated accounts) and trolls (human users who are sometimes paid to act on behalf of a foreign body or government agency) must have been involved. The election was annulled. It is also known that bots and trolls have been used to manipulate public opinion in many other digital discussions and topics, such as Brexit and the COVID-19 pandemic.
Social media: Extreme positions and vocal minorities get most attention
What happens in the digital sphere can have a huge influence on public opinion. At a conference entitled "Big Tech and digital democracy: How much regulation does public discourse need?" organized by DW and the University of Cologne as part of a series of events on Global Media Law, media and constitutional law expert Dieter Dörr stated that "democracy is under serious threat."
Established and respected media outlets are present on these platforms and use Instagram, YouTube and others as channels for their content. But there are also numerous other players. They don't even have to be bots or trolls: There are many accounts that do not uphold such standards, and instead incite hatred against others, spread false claims or use artificial intelligence (AI) to manipulate and generate images and videos.
The algorithms social media platforms use to decide which content is shown, when, and to whom reward this kind of behavior.
"Extreme opinions with a wide reach are pushed to the top," said Dörr, explaining that this is what keeps users on platforms for longer, allowing more money to be earned from them.
EU's Digital Services Act offers glimmer of hope
Social media platforms have become an important, if not the only, source of information for many people. Politicians and researchers have long recognized that the power wielded by these platforms is a problem. But can anything be done about it?
The European Union (EU) has stepped up efforts in recent years to regulate the digital world, primarily through its Digital Services Act (DSA), which came into force in February 2024. It requires major online platforms and search engines such as Amazon, Google, X and Facebook to provide greater transparency and protection for users.
Renate Nikolay, the deputy director-general of the Directorate-General for Communications Networks, Content and Technology (DG CONNECT) at the European Commission, which is responsible for enforcing the Digital Services Act, says: "We are pursuing three principles: First, platforms must assess and minimize systemic risks. Second, we are strengthening users' rights, for example by providing complaint mechanisms. Third, we demand transparency in algorithms and require platforms to give researchers access to their data."
This sounds like a big step forward: Platforms have to provide information on their algorithms and even offer users the option to disable personalized content or advertising. After all, algorithms not only tend to disadvantage moderate and nuanced content; they also create filter bubbles or echo chambers, in which users are mainly surrounded by content and other users that reflect their own views. This puts them at risk of falling into a spiral of radicalization.
The TikTok algorithm is particularly notorious. A recent study by the University of Potsdam and the Bertelsmann Foundation showed that during the last German election campaign, political parties were not equally visible in the TikTok feeds of young users. Videos from official party accounts on the political fringes, especially the AfD, were played more frequently than those from the accounts of more centrist parties.
During the period under review, the AfD uploaded 21.5% of all the videos, but these accounted for 37.4% of videos that appeared in feeds — roughly 1.7 times as many appearances as its share of uploads would predict. The AfD's videos were therefore overrepresented. For its part, the center-right CDU/CSU party of Chancellor Friedrich Merz uploaded 17.1% of all party videos, but these accounted for only 4.9% of videos in feeds.
When asked about this at the conference, Tim Klaws, Director of Government Relations and Public Policy for DACH, Israel and Benelux at TikTok, gave an evasive answer. He said that digital platforms had no interest in operating in an environment full of disinformation and populism, and were trying to minimize "fake news," hate speech and similar content with the help of AI and their staff members.
Finland promotes media literacy from kindergarten
Incidentally, apart from the DSA, there are other laws that regulate digital platforms, such as the European Media Freedom Act, which supplements the DSA by granting special status to recognized media outlets on large platforms — so that their content is treated transparently and cannot be removed without good reason.
Something else that experts say is important besides regulation: media literacy. People need to understand digital media better and use them more responsibly. Ultimately, users are the ones who post, consume, share and comment on content.
"Finland is exemplary in this regard, and we can learn something from them," says Nikolay.
Indeed, Finland has put in place a national strategy to promote media literacy, which starts as early as kindergarten and has resulted in Finns being very good at critically examining content and recognizing disinformation.
Only a combination of all the available measures can combat social media influence and online manipulation. But some experts such as Dörr remain skeptical: "There's not much that can be done against this tsunami," particularly as new challenges are constantly emerging, such as AI chatbots that provide false or biased information.
At the end of the DW conference, Nikolay made it clear that Europe was not "against the platforms," but wanted to work with them to "change business models so that they promote democracy rather than endanger it."
She said that one example of good cooperation was this year's parliamentary elections in Moldova. In the run-up to the polls, EU representatives, the Moldovan authorities, civil society actors and the operators of platforms such as Google, Meta, and TikTok had sat down together to counter disinformation and protect the electoral process.
This was apparently successful: A Russian disinformation campaign against the pro-European ruling Party of Action and Solidarity (PAS) did not have a decisive impact on the election results. PAS emerged as the election winner.
This article was translated from German.