The European Parliament has agreed to establish by law measures to curb illegal content, hold digital platforms accountable for their algorithms and improve content moderation. These include measures against illegal products, services and content, including clear procedures for their removal; more options for advertising without tracking and a prohibition on using minors' data to serve them personalized advertisements; the right of anyone who acquires a service over the internet to claim compensation for damages; and mandatory risk assessments and greater transparency in algorithms to fight harmful content and disinformation.
These are some of the elements contained in the Digital Services Act (DSA), presented by the European Commission in December 2020, on which MEPs set out their position this Thursday: new rules to address illegal content and to ensure that platforms such as Google, Facebook and Amazon are held accountable for their content moderation algorithms and practices.
The text, approved this Thursday by Parliament with 530 votes in favor, 78 against and 80 abstentions, will serve as the basis for negotiations with the French rotating presidency of the Council of the EU, which represents the Member States.
Socialist MEP Adriana Maldonado, a member of the Internal Market Committee, which is responsible for the file, stated: “This law is an opportunity to change the digital world forever: it includes the right to compensation for violations of the law and protection for minors with regard to targeted advertising”.
After the vote, Christel Schaldemose (S&D, Denmark), the MEP who heads Parliament’s negotiating team, said: “It is our obligation to ensure that what is illegal in the real world is also illegal in the virtual world. We have to make sure that the rules benefit consumers and citizens.”
The proposed Digital Services Act (DSA) defines clear responsibilities for intermediary service providers, and specifically for online platforms such as social networks and digital marketplaces.
The Digital Services Act establishes a “notice and action” mechanism, together with safeguards, for the removal of illegal products, services or content from digital platforms. Hosting service providers must act upon receipt of such notifications “without undue delay, taking into account the type of illegal content being reported and the urgency of taking action.” MEPs have also included stricter safeguards to ensure that notifications are processed in a non-arbitrary and non-discriminatory manner that respects fundamental rights, including freedom of expression.
The European Parliament insists that consumers must be able to buy safe products on digital marketplaces, so the obligation to trace traders (the “know your customer” principle) must be strengthened.
Online platforms considered to be very large will be subject to specific obligations due to the particular risks they pose in the dissemination of both illegal and harmful content. The Digital Services Act will help limit harmful content (which may not be illegal) and disinformation through several provisions: mandatory risk assessments, risk mitigation measures, independent audits, and transparency of so-called “recommender systems” (the algorithms that determine what users see).
Parliament introduced several further modifications to the Commission’s proposal, asking, for example:

- that micro and small businesses be exempted from some of the obligations of the Digital Services Act;
- on personalized advertising, that recipients can decide with greater transparency and full knowledge of the facts, including information on how their data will be monetized; withholding consent should be no more difficult or time-consuming than giving it, and if consent is refused or withdrawn, platforms must offer other access options, including the possibility of advertising without tracking;
- that targeting or amplification techniques based on minors’ personal data for the purpose of displaying advertising be prohibited, as well as personalized advertising based on special categories of data that allow vulnerable groups to be targeted;
- that recipients of digital services, and the organizations that represent them, be able to claim damages when platforms have not respected their due diligence obligations;
- that digital platforms be prohibited from misleading or nudging users to influence their behavior through “dark patterns”;
- more algorithm-based ranking options: very large online platforms should provide at least one recommender system that is not based on profiling.
Other amendments approved in plenary concern the need for providers to respect freedom of expression and the freedom and pluralism of the media in their terms and conditions, as well as a new provision on the right to use and pay for digital services anonymously.