Digital Services Act (Regulation (EU) 2022/2065)

Contents

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms
    Art. 23 Measures and protection against misuse

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
    Art. 33 Very large online platforms and very large online search engines
    Art. 35 Mitigation of risks

    SECTION 6
    Other provisions concerning due diligence obligations

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS
    Art. 89 Amendments to Directive 2000/31/EC



Article 23

Measures and protection against misuse

1.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.

2.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient of the service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online platforms. Those circumstances shall include at least the following:

(a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b) the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c) the gravity of the misuses, including the nature of illegal content, and of its consequences;

(d) where it is possible to identify it, the intention of the recipient of the service, the individual, the entity or the complainant.

4.   Providers of online platforms shall set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
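
The criteria in Article 23(3) amount to a small per-account review record. The following Python sketch is illustrative only: the Regulation prescribes the criteria, not any data model, scoring or threshold, so every name and number below is an assumption.

    from dataclasses import dataclass

    @dataclass
    class MisuseAssessment:
        """Hypothetical record of the Article 23(3) criteria for one account."""
        manifestly_bad_items: int   # (a) absolute number within the time frame
        total_items: int            # (b) denominator for the relative proportion
        gravity: float              # (c) severity on an invented 0.0-1.0 scale
        intentional: bool | None    # (d) intent, where it can be identified

        def warrants_review(self) -> bool:
            # Invented thresholds; Article 23(3) requires a timely, diligent
            # and objective case-by-case assessment, not a fixed rule.
            proportion = self.manifestly_bad_items / max(self.total_items, 1)
            score = proportion + self.gravity + (0.5 if self.intentional else 0.0)
            return self.manifestly_bad_items >= 3 and score > 1.0

Any real system would additionally have to issue the prior warning and bound the suspension period required by paragraphs 1 and 2, and document the policy in the terms and conditions as paragraph 4 demands.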

Article 33

Very large online platforms and very large online search engines

1.   This Section shall apply to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms or very large online search engines pursuant to paragraph 4.

2.   The Commission shall adopt delegated acts in accordance with Article 87 to adjust the number of average monthly active recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases at least by 5 % in relation to its population in 2020 or its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions.
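
The arithmetic in paragraph 2 is mechanical: the 45 million figure in paragraph 1 corresponds to roughly 10 % of the Union’s 2020 population of about 447 million, and the threshold is re-set, to 10 % of the new population rounded to whole millions, only once the population has moved by at least 5 %. A minimal sketch, with the function name and scenarios invented here:

    def adjusted_threshold(current_population: int,
                           reference_population: int,
                           current_threshold: int = 45_000_000) -> int:
        """Re-derive the Article 33 threshold per the paragraph 2 rule (sketch)."""
        change = abs(current_population - reference_population) / reference_population
        if change < 0.05:
            return current_threshold  # moved by less than 5 %: no adjustment
        # 10 % of the Union's population, rounded to a whole number of millions
        return round(current_population / 10 / 1_000_000) * 1_000_000

    assert adjusted_threshold(447_000_000, 447_000_000) == 45_000_000  # status quo
    assert adjusted_threshold(420_000_000, 447_000_000) == 42_000_000  # ~6 % decline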

3.   The Commission may adopt delegated acts in accordance with Article 87, after consulting the Board, to supplement the provisions of this Regulation by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1 of this Article and Article 24(2), ensuring that the methodology takes account of market and technological developments.

4.   The Commission shall, after having consulted the Member State of establishment or after taking into account the information provided by the Digital Services Coordinator of establishment pursuant to Article 24(4), adopt a decision designating as a very large online platform or a very large online search engine for the purposes of this Regulation the online platform or the online search engine which has a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1 of this Article. The Commission shall take its decision on the basis of data reported by the provider of the online platform or of the online search engine pursuant to Article 24(2), or information requested pursuant to Article 24(3) or any other information available to the Commission.

The failure by the provider of the online platform or of the online search engine to comply with Article 24(2) or to comply with the request by the Digital Services Coordinator of establishment or by the Commission pursuant to Article 24(3) shall not prevent the Commission from designating that provider as a provider of a very large online platform or of a very large online search engine pursuant to this paragraph.

Where the Commission bases its decision on other information available to the Commission pursuant to the first subparagraph of this paragraph or on the basis of additional information requested pursuant to Article 24(3), the Commission shall give the provider of the online platform or of the online search engine concerned 10 working days in which to submit its views on the Commission’s preliminary findings and on its intention to designate the online platform or the online search engine as a very large online platform or as a very large online search engine, respectively. The Commission shall take due account of the views submitted by the provider concerned.

The failure of the provider of the online platform or of the online search engine concerned to submit its views pursuant to the third subparagraph shall not prevent the Commission from designating that online platform or that online search engine as a very large online platform or as a very large online search engine, respectively, based on other information available to it.

5.   The Commission shall terminate the designation if, during an uninterrupted period of one year, the online platform or the online search engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1.

6.   The Commission shall notify its decisions pursuant to paragraphs 4 and 5, without undue delay, to the provider of the online platform or of the online search engine concerned, to the Board and to the Digital Services Coordinator of establishment.

The Commission shall ensure that the list of designated very large online platforms and very large online search engines is published in the Official Journal of the European Union, and shall keep that list up to date. The obligations set out in this Section shall apply, or cease to apply, to the very large online platforms and very large online search engines concerned from four months after the notification to the provider concerned referred to in the first subparagraph.
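
The four-month lead time in the second subparagraph of paragraph 6 is counted in calendar months. A short sketch of that date arithmetic; the add_months helper is invented here, and the example date matches the commonly cited first round of designations (notified 25 April 2023, obligations applying from 25 August 2023):

    import calendar
    from datetime import date

    def add_months(d: date, months: int) -> date:
        """Calendar-month addition, clamping the day for shorter months (sketch)."""
        idx = d.month - 1 + months
        year, month = d.year + idx // 12, idx % 12 + 1
        return date(year, month, min(d.day, calendar.monthrange(year, month)[1]))

    print(add_months(date(2023, 4, 25), 4))  # 2023-08-25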

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a) adapting the design, features or functioning of their services, including their online interfaces;

(b) adapting their terms and conditions and their enforcement;

(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d) testing and adapting their algorithmic systems, including their recommender systems;

(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
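
Point (k) above carries two concrete duties: mark generated or manipulated media prominently, and give recipients an easy way to indicate such items themselves. A hypothetical sketch of one way a provider might model this; the Regulation mandates the outcome, not a data model, and all names here are invented:

    from dataclasses import dataclass

    @dataclass
    class MediaItem:
        """Illustrative record for the duties in Article 35(1), point (k)."""
        url: str
        provider_marked_synthetic: bool = False  # provider's own marking
        recipient_flagged: bool = False          # set via the flagging functionality

        def display_label(self) -> str:
            # The prominent marking shown on the online interface.
            if self.provider_marked_synthetic or self.recipient_flagged:
                return "Generated or manipulated content"
            return ""

        def flag_as_synthetic(self) -> None:
            # The easy-to-use functionality for recipients of the service.
            self.recipient_flagged = True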

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 89

Amendments to Directive 2000/31/EC

1.   Articles 12 to 15 of Directive 2000/31/EC are deleted.

2.   References to Articles 12 to 15 of Directive 2000/31/EC shall be construed as references to Articles 4, 5, 6 and 8 of this Regulation, respectively.
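
The correspondence in paragraph 2 is one-to-one. Expressed as a lookup table, with the provisions' customary names added for orientation:

    # Article 89(2): references to the e-Commerce Directive's intermediary
    # liability provisions are read as references to the corresponding
    # articles of this Regulation.
    ECD_TO_DSA = {
        12: 4,  # mere conduit
        13: 5,  # caching
        14: 6,  # hosting
        15: 8,  # no general monitoring or active fact-finding obligations
    }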

