EU Regulations Go into Effect, Requiring Risk Assessments and Mitigation Measures from Large Tech Companies
New European Union regulations went into effect today for large technology companies and search engines, requiring them to assess the risks their platforms pose and to implement measures to mitigate those risks.
The regulations are part of the EU’s Digital Services Act (DSA) and Digital Markets Act (DMA), which entered into force in November 2022 with a months-long window for companies to prepare to comply. The two laws create a single set of rules that apply across the European Union, designed to make the digital space safer, protect the fundamental rights of users of digital services, and establish a level playing field for competition, innovation, and growth.
The aspects of the regulations that went into effect this week focus on very large service providers: those with more than 45 million monthly active users in the EU, or 10 percent of the EU’s population. These include Alibaba AliExpress, the Amazon Store, the Apple App Store, Facebook, Google Play, Instagram, LinkedIn, YouTube, TikTok, X (the platform formerly known as Twitter), and others.
“Very large online platforms and very large online search engines cause societal risks, different in scope and impact from those caused by smaller platforms,” according to the DSA. “Providers of such very large online platforms and of very large online search engines should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact.”
Illegal offline = illegal online. Applicable today. 💪 #DSA #UserProtection #OnlineDemocracy https://t.co/WkFts0aHji
— Margrethe Vestager (@vestager) August 25, 2023
The new rules require these companies to establish a point of contact for DSA regulation, report criminal offenses, have user-friendly terms and conditions, and be transparent about their advertising, recommender systems, and content moderation decisions.
Regulators focused on these factors because very large online platforms and search engines can “be used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade,” the regulation explained.
Additionally, very large service providers must “identify, analyze, and assess systemic risks that are linked to their services,” according to the European Commission.
“In determining the significance of potential negative effects and impacts, providers should consider the severity of the potential impact and the probability of all such systemic risks,” the DSA explains. “For example, they could assess whether the potential negative impact can affect a large number of persons, its potential irreversibility, or how difficult it is to remedy and restore the situation prevailing prior to the potential impact.”
The DSA specifically points to systemic risks related to illegal content; fundamental rights (such as freedom of expression, media freedom and pluralism, discrimination, consumer protection, and children’s rights); public security and electoral processes; gender-based violence; public health; protection of minors; and mental and physical wellbeing.
“For example, such dissemination or activities may constitute a significant systemic risk where access to illegal content may spread rapidly and widely through accounts with a particularly wide reach or other means of amplification,” according to the DSA. “Providers of very large online platforms and of very large online search engines should assess the risk of dissemination of illegal content irrespective of whether or not the information is also incompatible with their terms and conditions.”
Once those risks are identified, these very large firms are required to put measures in place to mitigate them. The firms must also create an internal compliance function; submit to yearly audits; share their data with the commission and national authorities; allow vetted researchers to access their platform data; offer recommender-system options that are not based on user profiling; and maintain a publicly available repository of advertisements.
Failure to comply with the regulations could result in fines of up to 6 percent of a company’s turnover or suspension of the service, the BBC reports. Very large service providers are subject to the regulations now, but smaller tech companies have until next year to comply with the rules.
Meta President of Global Affairs Nick Clegg wrote that the company—which owns Facebook, Instagram, Threads, and WhatsApp—has created a cross-functional team of more than 1,000 employees to develop solutions to meet the DSA’s requirements.
Some of those measures include expanding Meta’s Ad Library to display and archive for one year all ads that target people in the EU, prohibiting advertising to people ages 13 to 17 that is targeted based on their Meta app activity, and making illegal content easier for users to report.
“And while we already notify people when we remove a piece of their content, and typically give them the chance to appeal, we’ll now provide this information to people in the EU for a broader range of content moderation decisions,” Clegg explained. “This includes when we apply feature limits to people’s accounts and when we restrict content for violating local law.”
Google also released an update on how it is addressing requirements in the DSA, such as expanding its Ads Transparency Center and increasing data access for researchers analyzing Google Search, YouTube, Google Maps, Google Play, and Google Shopping. Google will also roll out a new Transparency Center for people to access its policies on a product-by-product basis, find reporting and appeals tools, and view transparency reports, wrote Laurie Richardson, vice president of trust and safety at Google, and Jennifer Flannery O’Connor, vice president of product management at YouTube.
“We have long been aligned with the broad goals of the DSA and have devoted significant resources into tailoring our programs to meet its specific requirements,” Richardson and O’Connor wrote in their statement. “We have also expressed our concerns about potential unintended consequences, such as the risk of making it easier for bad actors to abuse our services and spread harmful misinformation by providing too much information about our enforcement approach.”
The DSA and DMA regulations were proposed after commissioners identified that citizens were increasingly exposed to risk and harm online that threatened their fundamental rights, but there was no coordinated ability to supervise how platforms were addressing these risks, according to an impact summary from the European Commission.
The commission found in 2020 that there was broad consensus for implementing the regulations, which it estimated could increase cross-border digital trade by 1 to 1.8 percent.
“Asymmetric rules will ensure that smaller emerging competitors are boosted, helping competitiveness, innovation, and investment in digital services, while targeting specific harms emerging through large platforms,” according to the summary. “Transparency and safety online, as well as the protection of fundamental rights will improve. Enhanced cooperation between Member States and the EU level governance will improve enforcement, and provide an up-to-date supervisory system for digital services.”