The European Parliament has given its final vote of approval to the laws that will regulate digital content and impose duties on giants such as Google, Facebook and Amazon: the Digital Services Act (DSA) and the Digital Markets Act (DMA). “The objective is to address the social and economic effects of the technology industry by establishing standards for a safer and more open digital space for users, as well as a fairer digital market for companies”, says the European Parliament: “These are landmark digital standards that will force online companies to protect users from illegal content, increase accountability and limit the market dominance of tech giants.”
The Digital Services Act (DSA) includes measures to curb illegal content, hold digital platforms accountable for their algorithms and improve content moderation. In concrete terms: measures against illegal products, services and content, including clear procedures for their removal; more options for tracking-free advertising and a ban on using minors’ data to serve them personalized ads; the right of those who purchase a service over the internet to claim compensation for damages; and mandatory risk assessments and greater algorithmic transparency to combat harmful content and disinformation.
The DSA is the first standard of its kind for digital regulation and, as European Commission President Ursula von der Leyen has said, follows the principle that “what is illegal offline must also be illegal online”. Its objective is to protect the digital space against the dissemination of illegal content and to guarantee the protection of users’ fundamental rights.
These and other elements are contained in the Digital Services Act (DSA), presented by the European Commission in December 2020, on which the Council (the member-state governments), the European Parliament and the European Commission reached an agreement at the end of April in the trilogues, the negotiations between the three institutions. The new rules are needed to address illegal content and to ensure that platforms such as Google, Facebook and Amazon are accountable for their algorithms and follow content-moderation best practices.
Just over a year ago, the European Commission proposed two legislative initiatives to update the rules governing digital services in the EU: the Digital Services Act (DSA) and the Digital Markets Act (DMA).
The DMA was agreed in March with the idea of putting limits on the power of internet giants such as Google, Apple, Meta and Amazon, after Brussels concluded that the competition rules in force until now had not prevented the platforms from abusing their power.
That antitrust law is supplemented by the DSA.
The position of the European Parliament was approved on January 20 by 530 votes in favor, 78 against and 80 abstentions.
Socialist MEP Adriana Maldonado, a member of the Internal Market Committee responsible for the file, states: “It is an unprecedented milestone that places us at the forefront in terms of security and transparency in the digital sector and allows us to lead a fair digitization that shows the way to other countries. From now on, the platforms must assume their responsibilities and operate within a legal framework that until now was practically non-existent.”
The Digital Services Act (DSA) defines clear responsibilities for intermediary service providers and in particular for online platforms, such as social networks or digital markets.
The Digital Services Act establishes a “notice and action” mechanism, along with safeguards, for the removal of illegal products, services or content from digital platforms. Hosting service providers must act upon receipt of such notifications “without undue delay, taking into account the type of illegal content being reported and the urgency of taking action.”
Among the aspects this will cover are safeguards to protect fundamental rights; greater protection for victims of cyber-violence; the ability to report illegal content on a platform (which must then be removed); increased safety of products and services purchased through online marketplaces; and compensation for users, who will have the right to seek redress for any damage or loss suffered as a result of infringements by the platforms. “All this will translate into less exposure to illegal content and greater protection and respect for the fundamental rights and freedoms of consumers,” Maldonado stressed.
Online platforms considered very large will be subject to specific obligations because of the particular risks they pose in the dissemination of both illegal and harmful content.
The Digital Services Act will help limit harmful content (which may not be illegal) and disinformation. To that end, it includes several provisions: mandatory risk assessments, risk-mitigation measures, independent audits, and transparency of so-called “recommender systems” (the algorithms that determine what users see).
When it comes to online advertising, “users will have better control over how their personal data is used,” says Maldonado. Additionally, targeted advertising is prohibited when sensitive data is involved (for example, based on sexual orientation, religion, ethnicity).
Likewise, the DSA also urges platforms accessible to minors to take specific measures to protect them, including a total ban on targeted advertising.
Similarly, Maldonado has highlighted the special crisis-response measures included in the law. “When a crisis occurs, such as a threat to public health or safety, the Commission may require very large platforms to limit any urgent threats,” she noted.
Finally, a fundamental aspect addressed by the DSA is the fines that online platforms and search engines may receive if they fail to comply with the rules, which could reach up to 6% of their worldwide turnover.