Modern European digital services and AI legislation


In mid-April 2023, the Commission officially opened the European Centre for Algorithmic Transparency (ECAT) at the Joint Research Centre in Seville, Spain. ECAT will provide the Commission with in-house technical and scientific expertise to ensure that algorithmic systems used by very large online platforms and search engines across the EU comply with the risk management, mitigation and transparency requirements set out in the recently adopted Digital Services Act (DSA).

At the end of 2020, the Commission presented the proposal for the Digital Services Act (DSA) together with the proposal for the Digital Markets Act (DMA); together they form a comprehensive framework to ensure a safer and fairer EU-wide digital space.
Following the political agreement reached by the EU co-legislators in April 2022, the DSA entered into force on 16 November 2022. The deadline for platforms and search engines to publish the number of their monthly active users was 17 February 2023.
The DSA applies to all digital services that connect consumers to goods, services or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users’ rights online, and places digital platforms under a unique new transparency and accountability framework. Designed as a single, uniform set of rules for all member states, the DSA gives users new protections and businesses legal certainty across the EU single market. It is a first-of-its-kind regulatory toolbox globally and sets an international benchmark for a regulatory approach to online intermediaries.
At present, the Commission is analysing options for designating Very Large Online Platforms and Very Large Online Search Engines, which will have four months from their designation to comply with all DSA obligations and, in particular, to submit their first risk assessment.
Thus, from 17 February 2024 the DSA will apply to all intermediary services; by the same date, the EU member states are required to appoint national Digital Services Coordinators.

Online platforms and search engines: mitigating risks
The Digital Services Act imposes risk management requirements on companies designated by the European Commission as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). Under this framework, designated platforms will have to identify, analyse and mitigate a wide array of systemic risks on their platforms, ranging from how illegal content and disinformation can be amplified through their services, to their impact on freedom of expression and media freedom. Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated. The risk mitigation plans of designated platforms and search engines will be subject to independent audit and to oversight by the European Commission.
The European Centre for Algorithmic Transparency (ECAT) is a kind of digital inspectorate that will assess the EU’s large digital platforms and assist the Commission in enforcing the EU’s new digital legislation, the DSA. The centre will employ about 30 people, including data scientists and legal officers, to delve into the powerful code behind online platforms and search engines. ECAT will thus provide the Commission with in-house technical and scientific expertise to ensure that algorithmic systems used by VLOPs and VLOSEs comply with the risk management, mitigation and transparency requirements of the DSA. This includes, amongst other tasks, performing technical analyses and evaluations of algorithms.
An interdisciplinary team of data scientists, AI experts, social scientists and legal experts will combine their expertise to assess how these algorithmic systems function and to propose best practices for mitigating their impact. This will be crucial for the thorough analysis of the transparency reports and risk self-assessments submitted by the designated companies, and for carrying out inspections of their systems whenever required by the Commission.
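
To give a flavour of what such a technical evaluation might involve, the following is a purely hypothetical sketch in Python: a toy “amplification” metric computed over an invented recommendation log. The data layout, the metric and all names below are illustrative assumptions, not an actual ECAT or DSA methodology.

from dataclasses import dataclass

@dataclass
class ItemStats:
    item_id: str
    flagged: bool           # e.g. content flagged by trusted flaggers (hypothetical label)
    recommended_views: int  # views driven by the recommender system
    organic_views: int      # views from direct visits, search or followers

def amplification_share(items, flagged_only=False):
    """Share of total views that came from recommendations, overall or for flagged items only."""
    selected = [i for i in items if i.flagged] if flagged_only else list(items)
    recommended = sum(i.recommended_views for i in selected)
    total = recommended + sum(i.organic_views for i in selected)
    return recommended / total if total else 0.0

if __name__ == "__main__":
    # Made-up log: three items with their view counts split by traffic source.
    log = [
        ItemStats("a", flagged=False, recommended_views=900, organic_views=2100),
        ItemStats("b", flagged=True, recommended_views=800, organic_views=200),
        ItemStats("c", flagged=True, recommended_views=650, organic_views=350),
    ]
    print(f"overall share of recommended views: {amplification_share(log):.2f}")        # 0.47
    print(f"share for flagged items only: {amplification_share(log, True):.2f}")        # 0.72

In this made-up example, flagged content is disproportionately recommender-driven compared with content overall; a gap of that kind is the sort of signal a risk assessment or audit might flag for closer scrutiny.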

Digital Services Act
The Digital Services Act (DSA) regulates the obligations of digital services that act as intermediaries in their role of connecting consumers with goods, services, and content. This includes online marketplaces amongst others. It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.
The European Parliament and the Council reached a political agreement on the new rules on 23 April 2022, and the DSA entered into force on 16 November 2022, after being published in the EU Official Journal on 27 October 2022.
The e-Commerce Directive, adopted in 2000, has been the main legal framework for the provision of digital services in the EU: a horizontal framework that served as the cornerstone for regulating digital services in the European single market. Much has changed in 20 years, and the rules need to be upgraded. Online platforms have created significant benefits for consumers and innovation, have facilitated cross-border trading within and outside the Union, and have opened new opportunities to a variety of European businesses and traders. At the same time, they are abused for disseminating illegal content or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They pose particular risks for users’ rights, information flows and public participation. In addition, the e-Commerce Directive did not specify any cooperation mechanism between authorities, and the “Country of Origin” principle meant that supervision was entrusted to the country of establishment.

DSA’s main features
The DSA includes the following measures:
= Measures to counter illegal content online, including illegal goods and services. The DSA imposes new mechanisms allowing users to flag illegal content online, and for platforms to cooperate with specialised ‘trusted flaggers’ to identify and remove illegal content;
= New rules to trace sellers on online marketplaces, to help build trust and go after scammers more easily; a new obligation for online marketplaces to randomly check against existing databases whether products or services on their sites are compliant; and sustained efforts to enhance the traceability of products through advanced technological solutions;
= Effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions, based on new mandatory information provided to users when their content is removed or restricted;
= Wide ranging transparency measures for online platforms, including better information on terms and conditions, as well as transparency on the algorithms used for recommending content or products to users;
= New obligations for the protection of minors on any platform in the EU;
= Obligations for very large online platforms and search engines to prevent abuse of their systems by taking risk-based action, including oversight through independent audits of their risk management measures. Platforms must mitigate risks such as disinformation or election manipulation, cyber violence against women, or harms to minors online. These measures must be carefully balanced against restrictions of freedom of expression, and are subject to independent audits;
= A new crisis response mechanism for serious threats to public health and security, such as a pandemic or a war;
= Bans on targeted advertising on online platforms based on the profiling of children or on special categories of personal data such as ethnicity, political views or sexual orientation, together with enhanced transparency for all advertising on online platforms and for influencers’ commercial communications;
= A ban on using so-called ‘dark patterns’ on the interface of online platforms, referring to misleading tricks that manipulate users into choices they do not intend to make;
= New provisions allowing researchers access to data from key platforms, in order to scrutinise how platforms work and how online risks evolve;
= Users will have new rights, including a right to complain to the platform, seek out-of-court settlements, complain to their national authority in their own language, or seek compensation for breaches of the rules. Representative organisations will also be able to defend user rights for large scale breaches of the law;
= A unique oversight structure. The Commission will be the primary regulator for very large online platforms (those reaching at least 45 million users), while other platforms will be under the supervision of the member states where they are established. The Commission will have enforcement powers similar to those it has under antitrust proceedings. An EU-wide cooperation mechanism will be established between national regulators and the Commission;
= The liability rules for intermediaries have been reconfirmed and updated by the co-legislator, including a Europe-wide prohibition of generalised monitoring obligations.
Reference: https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348

DSA impact on business
The Digital Services Act applies to a wide range of online intermediaries, including services such as internet service providers, cloud services, messaging, marketplaces and social networks. Specific due diligence obligations apply to hosting services, and in particular to online platforms such as social networks, content-sharing platforms, app stores, online marketplaces, and online travel and accommodation platforms. The most far-reaching rules in the Digital Services Act focus on very large online platforms: these platforms have a significant societal and economic impact, reaching at least 45 million users in the EU (representing 10% of the population).
Similarly, very large online search engines with more than 10% of the 450 million consumers in the EU will bear more responsibility in curbing illegal content online.
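
As a purely illustrative sketch, the size threshold quoted above (10% of roughly 450 million EU consumers, i.e. 45 million monthly active users) can be expressed as a simple check in Python. The numbers are those given in the text; the function name and the sample figures are made up for the example, and actual designation is a formal Commission decision rather than a mere calculation.

# The 450 million consumer base and the 10% share come from the text above;
# the function and the sample figures are hypothetical.
EU_CONSUMERS = 450_000_000
VERY_LARGE_SHARE = 0.10
VERY_LARGE_THRESHOLD = int(EU_CONSUMERS * VERY_LARGE_SHARE)  # 45,000,000 monthly active users

def meets_designation_threshold(monthly_active_users: int) -> bool:
    """True if a reported user base reaches the size associated with VLOP/VLOSE designation."""
    return monthly_active_users >= VERY_LARGE_THRESHOLD

for name, users in [("platform_a", 52_000_000), ("search_engine_b", 8_500_000)]:
    print(f"{name}: reaches threshold = {meets_designation_threshold(users)}")
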
The DSA modernises and clarifies rules dating back to the year 2000; it now sets a global benchmark, under which online businesses will benefit from a modern, clear and transparent framework ensuring that rights are respected and obligations are enforced.
Moreover, for online intermediaries, and in particular for hosting services and online platforms, the new rules will cut the costs of complying with 27 different regimes in the EU single market. This will be particularly important for innovative SMEs, start-ups and scale-ups, which will be able to scale at home and compete with very large players. Small and micro-enterprises will be exempted from some of the rules that might be more burdensome for them, and the Commission will carefully monitor the effects of the new Regulation on SMEs.
Other businesses will also benefit from the new set of rules: they will have access to simple and effective tools for flagging illegal activities that damage their trade, as well as internal and external redress mechanisms, affording them better protections against erroneous removal, limiting losses for legitimate businesses and entrepreneurs.
Furthermore, providers that voluntarily take measures to further curb the dissemination of illegal content can be reassured that such measures will not cost them their protection from legal liability.
Source: https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348 (14 November 2022).

European approach to AI
The optimal functioning of online platforms and search engines across the EU and in the member states cannot be attained without ECAT’s research and foresight capacity. Thus, JRC researchers will build on and further advance their longstanding expertise in the field of artificial intelligence (AI), which has already been instrumental in the preparation of other milestone pieces of regulation, such as the AI Act and the Coordinated Plan on AI and its subsequent reviews.
Faced with the rapid technological development of AI and a global policy context in which more and more countries are investing heavily in AI, the EU intends to act as a global leader in harnessing the opportunities and addressing the challenges of AI in a future-proof manner. To promote the development of AI and, equally, to address the potentially high risks it poses to safety and fundamental rights, the Commission presented both a proposal for a regulatory framework on AI and a revised Coordinated Plan on AI.
On the Coordinated Plan, see: https://digital-strategy.ec.europa.eu/en/library/coordinated-plan-artificial-intelligence-2021-review.

For years, the Commission has been facilitating and enhancing cooperation on AI across the EU to boost its competitiveness and ensure trust based on EU values. Following the publication of the European Strategy on AI in 2018 and after extensive stakeholder consultation, the High-Level Expert Group on Artificial Intelligence (HLEG) developed Guidelines for Trustworthy AI in 2019 and an Assessment List for Trustworthy AI in 2020.
In parallel, the first Coordinated Plan on AI was published in December 2018 as a joint commitment with Member States.
The Commission’s White Paper on AI, published in 2020, set out a clear vision for AI in Europe: an ecosystem of excellence and trust, setting the scene for the present regulatory proposal. The public consultation on the White Paper elicited widespread participation from across the world. The White Paper was accompanied by a “Report on the safety and liability implications of Artificial Intelligence, the Internet of Things and robotics”, which concluded that the current product safety legislation contains a number of gaps that need to be addressed, notably in the so-called Machinery Directive.
See also the Commission press release of 21 April 2021: https://ec.europa.eu/commission/presscorner/detail/en/IP_21_1682

ECAT researchers will focus not only on identifying and addressing systemic risks stemming from VLOPs and VLOSEs, but will also investigate the long-term societal impact of existing algorithms more generally.

More information on the Commission’s websites:
= European Centre for Algorithmic Transparency (ECAT);
= EU Official Journal text on the DSA;
= Digital Services Act Q&A;
= Digital Services Act fact page;
= The Digital Services Act package.
