Digital platforms in Europe: the size matters

There are presently more than twenty designated very large digital platforms and search engines in the EU-27; they all have to comply with the general obligations under the EU digital services law. Among the main general provisions are the obligations for online marketplaces to: a) ensure the traceability of traders on their platforms; b) design their interfaces to facilitate traders’ compliance with EU law; and c) inform consumers about illegal products as soon as they become aware of the illegality.

Background
Responsible and diligent behavior by providers of intermediary services is essential for a safe, predictable and trustworthy online environment and for allowing EU citizens to exercise the fundamental rights guaranteed in the EU Charter of Fundamental Rights, in particular the freedom of expression and of information, the freedom to conduct a business, the right to non-discrimination and the attainment of a high level of consumer protection.
Information society services, and especially intermediary services, have become an important part of the European economy and of citizens’ daily life. More than 20 years after the adoption of the legal framework for such services (laid down in Directive 2000/31), new and innovative business models and services, such as online social networks and online platforms allowing consumers to conclude distance contracts with traders, have allowed business users and consumers to access information and engage in transactions in novel ways. Millions of EU citizens now use these services daily; however, the digital transformation and increased use of these services have also resulted in new risks and challenges for individual recipients of the relevant services, companies and society as a whole.
The new regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The EU-wide approximation of national regulatory measures on the requirements for providers of intermediary services has been necessary to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. The regulation underlines that technological requirements should not hamper innovation but rather stimulate it.
The regulation on a Single Market for Digital Services (the Digital Services Act, DSA) was adopted in October 2022; it includes over 90 (!) articles, as well as 156 (!) introductory recitals. The regulation’s main purpose is to “contribute to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights, including the principle of consumer protection, are effectively protected”.

At the end of April 2024, the Commission formally designated Shein as another VLOP (apparently nr. 24) under the DSA; Shein is an online fashion retailer with an average of more than 45 million monthly users in the EU, which is above the DSA threshold for designation as a VLOP. Following the designation as a VLOP, Shein will have to comply with the most stringent rules under the DSA within four months of its notification (i.e. by the end of August 2024), such as the obligation to adopt specific measures to empower and protect users online, including minors, and to duly assess and mitigate any systemic risks stemming from its services.
Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_24_2326

Digital “gateways”
Some big platforms constitute an important “gateway” for business users to reach end users, and they are therefore designated as “gatekeepers” by another European online regulation, the EU Digital Markets Act (DMA). The DMA aims to ensure contestable and fair markets in the digital sector: it regulates gatekeepers, i.e. large digital platforms that provide an important gateway between business users and consumers and whose position can grant them the power to create a bottleneck in the digital economy. In September 2023, the Commission designated six gatekeepers: Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft, which had to fully comply with all DMA obligations by March 2024.
On 29 April 2024, the European Commission designated Apple’s iPadOS (its operating system for tablets) as a gatekeeper under the DMA; Apple now has six months to ensure full compliance of iPadOS with the DMA obligations.
Source: https://ec.europa.eu/commission/presscorner/detail/da/ip_24_2363

On VLOPs and VLOSEs
In April 2023, the Commission designated the first 19 Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs); from the end of August 2023, these platforms and search engines had to comply with the additional obligations under the Digital Services Act (DSA) described in art. 5 (e.g. complying with conditions on access to the information; complying with rules regarding the updating of the information, specified in a manner widely recognised and used by industry, etc.).
By mid-February 2024, all online intermediaries and platforms, with exceptions for small and micro enterprises, had to comply with the general obligations introduced by the DSA. Besides, at the end of December 2023, three additional VLOPs were designated, for which the additional obligations became applicable from the end of April 2024.
The supervision and enforcement of the DSA is shared between the Commission and Digital Services Coordinators, which had to be designated by Member States by 17 February 2024.
However, it is necessary to impose on the providers of those platforms specific obligations, in addition to the obligations applicable to all online platforms. This is vital because of the importance of VLOPs, owing in particular to the number of recipients of their services, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas to the public, and in influencing how recipients obtain and communicate information online.
Due to their critical role in locating and making information retrievable online, it is also necessary to impose those obligations on VLOSEs; these additional obligations on VLOPs and VLOSEs are necessary in view of public policy concerns, given the absence of less restrictive measures that would effectively achieve the same result. Moreover, these platforms and search engines may cause societal risks that are different in scope and impact from those caused by smaller platforms. Providers of such very large online platforms and very large online search engines should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact.
Once the number of active recipients of a platform or search engine, calculated as an average over a period of six months, reaches a significant share of the Union population, the systemic risks posed may have a disproportionate impact across the member states. Such a significant share should be considered reached where the operational threshold of 45 million active recipients is met (the number equivalent to 10 percent of the EU population; this operational threshold may be updated by the Commission). The average number of active recipients is determined individually for each service; besides, the number of average monthly active recipients of an online platform should reflect all the recipients actually engaging with the service at least once in a given period of time, by being exposed to information disseminated on the online interface of the online platform, such as viewing it or listening to it, or by providing information, such as traders on an online platform allowing consumers to conclude distance contracts with traders.
More on the issue in: https://digital-strategy.ec.europa.eu/en/policies/dsa-vlops
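The designation arithmetic can be illustrated with a minimal sketch: average the monthly active recipients over six months and compare the result with the 45-million threshold. Only the threshold itself comes from the regulation; the helper name and the sample figures below are hypothetical.

```python
# Minimal sketch of the DSA designation arithmetic; only the 45-million
# threshold is taken from the regulation - the helper name and the sample
# monthly figures are hypothetical.

VLOP_THRESHOLD = 45_000_000  # roughly 10 % of the EU population; updatable by the Commission

def is_vlop_candidate(monthly_active_recipients: list[int]) -> bool:
    """True if the average of six monthly active-recipient figures
    meets or exceeds the designation threshold."""
    if len(monthly_active_recipients) != 6:
        raise ValueError("expected six monthly figures (six-month averaging period)")
    average = sum(monthly_active_recipients) / len(monthly_active_recipients)
    return average >= VLOP_THRESHOLD

# Example: a retailer reporting around 45 million average monthly users
sample_figures = [44_000_000, 45_500_000, 46_000_000, 45_200_000, 44_800_000, 45_700_000]
print(is_vlop_candidate(sample_figures))  # True: the six-month average is about 45.2 million
```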

Categories of systemic “digital risks”
Both VLOPs and VLOSEs can be used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimized to benefit their often advertising-driven business models and can cause societal concerns. Effective regulation and enforcement is necessary in order to identify and mitigate the risks and the societal and economic harm that may arise. Under the regulation, providers of VLOPs and VLOSEs should therefore assess the systemic risks stemming from the design, functioning and use of their services, as well as from potential misuses by the recipients of the service, and should take appropriate mitigating measures in observance of fundamental rights. In determining the significance of potential negative effects and impacts, providers should consider the severity of the potential impact and the probability of all such systemic risks. For example, they could assess whether the potential negative impact can affect a large number of persons, its potential irreversibility, or how difficult it is to remedy and restore the situation prevailing prior to the potential impact.
The regulation stipulates that four categories of systemic risks should be assessed in depth by VLOP and VLOSE providers (a minimal illustrative sketch of such an assessment follows the list and its source reference below):
= First category concerns the risks associated with the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech or other types of misuse of their services for criminal offences, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including dangerous or counterfeit products, or illegally-traded animals. For example, such dissemination or activities may constitute a significant systemic risk where access to illegal content may spread rapidly and widely through accounts with a particularly wide reach or other means of amplification. Providers of very large online platforms and of very large online search engines should assess the risk of dissemination of illegal content irrespective of whether or not the information is also incompatible with their terms and conditions. This assessment is without prejudice to the personal responsibility of the recipient of the service of very large online platforms or of the owners of websites indexed by very large online search engines for possible illegality of their activity under the applicable law.
= Second category concerns the actual or foreseeable impact of the service on the exercise of fundamental rights, as protected by the Charter, including but not limited to human dignity, freedom of expression and of information, including media freedom and pluralism, the right to private life, data protection, the right to non-discrimination, the rights of the child and consumer protection. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or by the very large online search engine or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. When assessing risks to the rights of the child, providers of very large online platforms and of very large online search engines should consider for example how easy it is for minors to understand the design and functioning of the service, as well as how minors can be exposed through their service to content that may impair minors’ health, physical, mental and moral development. Such risks may arise, for example, in relation to the design of online interfaces which intentionally or unintentionally exploit the weaknesses and inexperience of minors or which may cause addictive behavior.
= Third category of risks concerns the actual or foreseeable negative effects on democratic processes, civic discourse and electoral processes, as well as public security.
= Fourth category of risks stems from similar concerns relating to the design, functioning or use, including through manipulation, of very large online platforms and of very large online search engines with an actual or foreseeable negative effect on the protection of public health and of minors, serious negative consequences to a person’s physical and mental well-being, or gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioral addictions of recipients of the service.
Source for citation’s references to regulation 2022/2065 in: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065
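To make the risk-assessment guidance above concrete, here is a purely illustrative sketch of how a provider might record the four categories with a severity-times-probability score. The DSA names the categories and points to severity and probability as criteria, but the 1-5 scales, the class name and the example values below are assumptions, not anything prescribed by the regulation.

```python
# Purely illustrative risk register for the four DSA systemic-risk categories.
# The severity/probability criteria echo the regulation's recitals, but the
# scoring scale, the class and the example values are assumptions.
from dataclasses import dataclass

RISK_CATEGORIES = (
    "dissemination of illegal content",
    "negative effects on the exercise of fundamental rights",
    "negative effects on democratic processes, civic discourse, elections and public security",
    "negative effects on public health, minors and physical and mental well-being",
)

@dataclass
class RiskAssessment:
    category: str
    severity: int     # 1 (limited, easily remedied) .. 5 (affects many people, hard to reverse)
    probability: int  # 1 (unlikely) .. 5 (very likely)

    @property
    def score(self) -> int:
        return self.severity * self.probability

# Example: rank hypothetical assessments to prioritise mitigation measures
assessments = [
    RiskAssessment(RISK_CATEGORIES[0], severity=4, probability=3),
    RiskAssessment(RISK_CATEGORIES[1], severity=3, probability=3),
    RiskAssessment(RISK_CATEGORIES[2], severity=5, probability=2),
    RiskAssessment(RISK_CATEGORIES[3], severity=3, probability=2),
]
for a in sorted(assessments, key=lambda a: a.score, reverse=True):
    print(f"{a.score:2d}  {a.category}")
```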

When assessing such systemic risks, providers of VLOPs and VLOSEs should focus on the systems or other elements that may contribute to the risks, including all the algorithmic systems that may be relevant, in particular their recommender systems and advertising systems, paying attention to the related data collection and use practices. They should also assess whether their terms and conditions and the enforcement thereof are appropriate, as well as their content moderation processes, technical tools and allocated resources.
When assessing the systemic risks identified in the regulation, those providers should also focus on information which is not illegal but contributes to the systemic risks identified in this Regulation. Such providers should therefore pay particular attention to how their services are used to disseminate or amplify misleading or deceptive content, including disinformation. Where the algorithmic amplification of information contributes to the systemic risks, those providers should duly reflect this in their risk assessments. Where risks are localized or there are linguistic differences, those providers should also account for this in their risk assessments. Providers of very large online platforms and of very large online search engines should, in particular, assess how the design and functioning of their service, as well as the intentional and, oftentimes, coordinated manipulation and use of their services, or the systemic infringement of their terms of service, contribute to such risks. Such risks may arise, for example, through the inauthentic use of the service, such as the creation of fake accounts, the use of bots or the deceptive use of a service, and other automated or partially automated behaviors, which may lead to the rapid and widespread dissemination to the public of information that is illegal content or incompatible with an online platform’s or online search engine’s terms and conditions and that contributes to disinformation campaigns.
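As an illustration of the kind of signal such an assessment might look at, the sketch below flags content whose rapid spread is driven mostly by very new accounts, a pattern the regulation associates with fake accounts and bots. Every threshold, field name and the function itself are hypothetical, taken neither from the DSA nor from any provider’s actual tooling.

```python
# Hypothetical heuristic (not from the DSA or any real platform tooling):
# flag an item whose rapid, widespread dissemination is driven mainly by
# freshly created accounts - a pattern associated with fake accounts and bots.
from datetime import datetime, timedelta

def looks_like_inauthentic_amplification(
    shares: list[dict],             # each: {"account_created": datetime, "shared_at": datetime}
    now: datetime,
    window_hours: int = 6,          # how recent the burst must be
    min_shares: int = 10_000,       # what counts as "rapid and widespread"
    new_account_ratio: float = 0.6, # share of very new accounts that triggers the flag
) -> bool:
    recent = [s for s in shares if now - s["shared_at"] <= timedelta(hours=window_hours)]
    if len(recent) < min_shares:
        return False  # no rapid, widespread dissemination in the window
    fresh = [s for s in recent
             if s["shared_at"] - s["account_created"] <= timedelta(days=7)]
    return len(fresh) / len(recent) >= new_account_ratio
```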
By 17 February 2024, the member states had to designate their Digital Services Coordinators, i.e. the national authorities in charge of supervising and enforcing the DSA at national level.
Source: https://digital-strategy.ec.europa.eu/en/policies/dsa-cooperation

Supplement. Digital services: legislative sources
= Directive 2000/31/EC of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (OJ L 178, 17.7.2000, p. 1).
= Directive 2015/1535 of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on information society services (OJ L 241, 17.9.2015, p. 1).
= Directive 2010/13/EU of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) (OJ L 95, 15.4.2010, p. 1).
= Regulation 2019/1150 of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services (OJ L 186, 11.7.2019, p. 57).
= Regulation 2021/784 of 29 April 2021 on addressing the dissemination of terrorist content online (OJ L 172, 17.5.2021, p. 79).
= Directive 2002/58/EC of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (OJ L 201, 31.7.2002, p. 37).
= Regulation 2017/2394 of 12 December 2017 on cooperation between national authorities responsible for the enforcement of consumer protection laws and repealing Regulation (EC) No 2006/2004 (OJ L 345, 27.12.2017, p. 1).
= Regulation 2019/1020 of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) No 765/2008 and (EU) No 305/2011 (OJ L 169, 25.6.2019, p. 1).
= Directive 2001/95/EC of 3 December 2001 on general product safety (OJ L 11, 15.1.2002, p. 4).
= Directive 2005/29/EC of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market (‘Unfair Commercial Practices Directive’) (OJ L 149, 11.6.2005, p. 22).
= Directive 2011/83/EU of 25 October 2011 on consumer rights (OJ L 304, 22.11.2011, p. 64).
= Directive 2013/11/EU of 21 May 2013 on alternative dispute resolution for consumer disputes (OJ L 165, 18.6.2013, p. 63).
= Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts (OJ L 95, 21.4.1993, p. 29).
= Regulation 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (OJ L 119, 4.5.2016, p. 1).
= Directive 2001/29/EC of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society (OJ L 167, 22.6.2001, p. 10).
= Directive 2004/48/EC of 29 April 2004 on the enforcement of intellectual property rights (OJ L 157, 30.4.2004, p. 45).
= Directive 2019/790 of 17 April 2019 on copyright and related rights in the Digital Single Market (OJ L 130, 17.5.2019, p. 92).
= Directive 2018/1972 of 11 December 2018 establishing the European Electronic Communications Code (OJ L 321, 17.12.2018, p. 36).
Source: https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
Additional info in: https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348
