The European Parliament adopts its position on the proposed Digital Services Act (DSA)

The DSA aims to modernize and establish a uniform EU-wide framework on the treatment of illegal or potentially harmful content online, the liability of online intermediaries for third-party content, the protection of users’ fundamental rights online and the reduction of information asymmetries between online intermediaries and their users. You can read key aspects of the European Commission’s proposal for the DSA, published on 15 December 2020, in our blogs here and here.

Compared to the European Commission’s proposal, the main amendments introduced by the European Parliament are as follows:

  • targeting or amplification techniques that process, reveal or infer the personal data of minors and vulnerable groups, as well as special categories of data as defined by Article 9 of the GDPR (for example, personal data relating to political beliefs, religion or sexual orientation), are prohibited;
  • provisions imposing additional disclosure and transparency requirements for targeted advertising are extended, including in relation to informing users of digital services about how their data will be monetised. The amendments will require online platforms to ensure that recipients of their services can opt out of, or withdraw consent for, targeted ads in a way that is no more difficult or time-consuming than giving consent. Withholding consent to the processing of personal data for advertising purposes must not result in access to platform functionality being disabled, and alternative access options (including no-tracking options) must be fair and reasonable;
  • users of digital services and the organizations that represent them must be able to seek compensation for any direct damage or loss resulting from the platforms’ failure to comply with their due diligence obligations;
  • intermediary service providers are not permitted to use deceptive or nudging techniques to influence user behavior through “dark patterns”. New DSA provisions prohibit providers from giving greater visual prominence to any one consent option and from repeatedly requesting consent to data processing where consent has previously been withheld (including through the use of pop-ups);
  • the additional obligations imposed on very large online platforms are extended. For example, when carrying out mandatory risk assessments and implementing risk mitigation measures, these providers will need to take fundamental rights into account. Strict requirements apply to independent audits (e.g. to avoid auditor bias or dependency), to ensuring transparency of ‘recommender systems’ (the algorithms that determine what users see) and to providing users with at least one option that is not based on profiling; and
  • the terms and conditions of intermediary service providers must be fair, non-discriminatory and transparent, respect fundamental rights and freedoms and be available in the language of the Member State to which the service is directed. They must be drafted in clear and unambiguous terms and include information on all policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review, as well as the right to terminate the use of the service. They should be supplemented by a concise and easily readable summary of the key elements. Terms and conditions that do not comply with these requirements do not bind the recipients of the services.

Negotiations between the European Parliament and the Council of the EU to agree the final text of the DSA will now follow. The Council of the EU adopted its position on the DSA on 25 November 2021; a summary of the Council’s position prepared by Allen & Overy is also available.

Read the press release “Digital Services Act: regulation of platforms for a safer online space for users” and the preliminary text as adopted by the European Parliament.
