
EU preps far-reaching regulation of digital platforms

Two legal requirements for online platforms — swift response to complaints and transparency regarding algorithms — have so far been absent from codified EU law. The new Digital Services Act is about to change that.

STRASBOURG, France (CN) — European Union lawmakers and regulators are closing in on a sweeping plan to regulate digital platforms like Facebook, Twitter, Instagram and YouTube that will both force transparency in how their algorithms operate and make it easier to have offensive content removed.

“The regulation will have a massive impact because we actually start to regulate the platforms. Now we can hold them responsible for the algorithms used for data sharing, and secure transparent mechanisms behind the removal of user content,” said Christel Schaldemose, a Danish member of one of the European Parliament’s biggest political groups, the Progressive Alliance of Socialists and Democrats. She is the rapporteur leading the push to adopt the comprehensive Digital Services Act.

The act was first presented in 2020 by European Commission Executive Vice President Margrethe Vestager, who called it “a milestone in our journey to make Europe fit for the digital age.” Its twin, the Digital Markets Act, deals with fair competition and market power dynamics online.

The Digital Services Act is in final negotiations between EU regulators and the bloc's 27 member states. But why is the act so significant? Courthouse News asked Rikke Frank Jørgensen, a senior researcher at Copenhagen’s Institute for Human Rights who has studied technology and human rights for years.

“It is important that our democracy sets rules for the power that private companies can exert over citizens in shared digital spaces. The new regulation updates the Electronic Commerce Directive from 2000. It is very timely because big social media platforms have an incredible influence through their role as public debate forums,” she said.

Jørgensen believes any legislation on digital content is fundamentally tricky to balance. On one hand, platforms like Facebook, Twitter and Google should remain forums for freedom of expression and participation. On the other, the platforms use sophisticated artificial intelligence to target and distribute content for specific commercial and ranking purposes, in ways that have so far been largely hidden from the public eye.

“Today, commercial platforms influence our ability and rights to participate politically, culturally and socially in society. All while they profit from our personal data. Many use these services as a default communication option, but the companies analyze you and affect your opinions in ways that are not at all transparent,” Jørgensen said.

She noted the nature of social media platforms makes them difficult to regulate. While news media outlets have an editor-in-chief who takes responsibility for everything they publish online, Facebook and similar platforms define themselves as “open communities.” They are technically internet services, even though they really function as information distributors and opinion shapers — in effect, intermediaries wielding great power with very limited responsibility.

The Digital Services Act seeks to change that by covering all active services that moderate content and advertise products. However, Schaldemose’s team underlines that the act does not define illegal content or harmonize penalty rules across member states. Instead, it lays out the legal modus operandi when citizens or trusted flaggers report problematic cases.

European Parliament member Christel Schaldemose speaks about the Digital Services Act at the plenary. (Brigitte Hase/EP 2022)

Providers must now establish an internal system to handle complaints and ensure a timely response. Users reporting potentially illegal content are entitled to an explanation of how their complaint was handled and what decision was made. And each member state must designate a digital authority for users to turn to when they are unsatisfied with a platform's response.

In the worst case, violations can trigger a fine of 6% of a company’s annual revenue — quite a stick to keep any digital service constantly vigilant. In 2021, Facebook parent Meta reported $117 billion in revenue. Google took in $257 billion.

The European Commission built the act on a so-called asymmetric due diligence system, in which responsibility increases with the complexity of the service, Schaldemose's office said. As a result, an intermediary service that simply provides Wi-Fi to café customers is subject to fewer regulatory requirements than a social media platform.

Mandatory risk assessments are the other big pillar of the regulation. The EU now requires yearly evaluations of how the use of algorithms and artificial intelligence can negatively affect either society at large or the physical and mental well-being of individual users.

Companies will report risk in different categories, such as harmful effects on the body, and publish the results. While many big platforms already screen for disinformation and dangerous content, no systematic legislation has been in place until now.

“The new rules focus on the whole process around illegal content,” Jørgensen said. “What a notice of action should look like, what response the platforms must give, where citizens can complain, how risk assessments must be carried out, and which authorities carry the primary responsibility for checking up. They also touch upon transparency and algorithmic accountability. It is a major piece of legislation.”

As of press time, the negotiating EU parties are discussing additions, such as a ban on marketing to minors online. The focus intensified after Facebook whistleblower Frances Haugen testified before U.S. lawmakers about the addictive effects of Instagram's algorithms on 10-year-olds.

Pundits expect the European Parliament to sign off on the Digital Services Act by July, if for no other reason than that France holds the current presidency of the Council of the European Union and adopting it would be a feather in the nation's cap.

Courthouse News correspondent Mie Olsen is based in Copenhagen, Denmark.


For a deeper dive into the EU's privacy efforts in the digital age, listen to Courthouse News's podcast Sidebar and its episode on "the right to be forgotten."



