European politicians reach new deal on social media regulations
Digital Services Act could compel online platforms to share more information, address harmful content.
Europe is again raising the bar on tech regulation, preparing new standards for the operation of social media and other digital platforms that are expected to ripple globally.
Politicians in the European Union reached an agreement on the main points of a new law aimed at forcing tech companies to take more responsibility for the content their users post online, part of a sweeping package of regulations introduced by the EU to set new rules on digital competition and services.
The Digital Services Act is set to introduce a variety of requirements for online platforms, including standards for taking down illegal content, a ban on targeted advertising aimed at children and new obligations on vetting third-party sellers. Very large platforms, defined as those serving more than 45 million users in the EU, would also be expected to complete risk assessments and allow regulators to access the algorithms they use to determine what content users see.
The text of the deal wasn’t immediately available, but a statement from the European Parliament outlined some of the central points negotiators had agreed to.
The legislation “will set new global standards,” Christel Schaldemose, a member of the European Parliament from Denmark who has been the body’s lead negotiator for the DSA, said in the statement. “We have finally made sure that what is illegal offline is also illegal online.”
The EU isn’t alone in pursuing new regulations for digital companies. In the US, the Biden administration has put its support behind antitrust legislation that seeks to blunt dominant tech companies’ market power. A proposal in the UK aims to force companies to address harmful content such as material related to eating disorders or self-harm.
The new EU rules, as proposed, could affect a range of business practices, from the way online marketplaces like Amazon.com engage with third-party sellers to how social-media companies such as Meta Platforms’ Facebook or ByteDance’s TikTok respond to concerns about harmful posts or about decisions to lock users’ accounts.
Often referred to as the DSA, the legislation is the second part of an expansive EU package intended to set new rules for competition and online content, and could have widespread global impacts for companies and consumers. Its counterpart, the Digital Markets Act, was agreed to by politicians in March and is set to impose new competition rules, backed by hefty fines, on the world’s biggest tech companies.
The political agreement on the DSA clears a path for the legislation to move forward. It will still need final approval from the EU Parliament and representatives from EU countries, but that process is unlikely to result in any further, significant changes.
The DSA could serve as an example for other countries, some observers said. Europe has been a first mover on regulations in the past, including with its landmark privacy legislation known as the General Data Protection Regulation, or GDPR, and more recently with the Digital Markets Act.
“Europe has developed a bit of an appetite” for being the first to bring in new regulations, said Joris van Hoboken, a law professor at Vrije Universiteit Brussel, a university in Brussels. “The hope is that this will set an international standard.”
Negotiators began discussions Friday morning in a meeting that was widely expected to result in a deal. Sticking points ahead of the talks included how to set limits on advertising that targets children or relies on sensitive data, and how to address so-called dark patterns, where a user might be manipulated into doing or agreeing to something they hadn’t intended.
Politicians also were expected to discuss a plan to charge large platforms a fee to cover the costs of enforcing the new rules.
The legislation isn’t meant to dictate what legal content digital platforms can and can’t allow online, experts said. Instead, it aims to set procedural standards for dealing with illegal content and ensuring that companies are applying their own terms and conditions fairly and consistently. It also seeks to compel platforms to share more information about how they make decisions — such as taking down content or locking a user’s account — and offer a process for users to complain if they disagree with those decisions.
One provision that was added late in the legislative process would allow regulators to require very large platforms “to limit any urgent threats” on those platforms during a crisis, such as a public security concern or a health threat like the Covid-19 pandemic. Requirements under the provision would be limited to three months, the statement from the European Parliament said.
Very large companies that don’t comply with the new rules could face fines of up to 6 per cent of their annual global revenue, according to an earlier draft of the legislation.
The legislation was formally proposed in December 2020 and has since moved swiftly through the EU’s sometimes cumbersome lawmaking process. Government and industry representatives have previously said that revelations from The Wall Street Journal’s Facebook Files series, which found that Facebook, now known as Meta, was aware that its platforms had flaws that caused harm to some users, accelerated the push.
The Computer and Communications Industry Association, a trade group that represents tech companies, has expressed support for the goals of the legislation but said some of the obligations appeared unrealistic, such as offering the possibility of redress for any user whose content a platform demotes.
As proposed, the DSA also contains obligations that would affect a range of other companies that aren’t considered online platforms, such as internet service providers and web hosting services. The obligations for those companies would be much more limited, compared with those meant to apply to online platforms.
That means hundreds of thousands of companies could fall under the scope of the new legislation, said Daphne Keller, who teaches internet law at Stanford University. She said many smaller companies likely aren’t aware of the new obligations.
Companies that do any kind of content moderation “probably need to spend the next six months hiring and building new processes,” she said. “This is going to be a heavy lift.”
The Wall Street Journal