Digital Services Act, Regulation (EU) 2022/2065



Index:

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services
      Art. 14 Terms and conditions

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
      Art. 35 Mitigation of risks
      Art. 41 Compliance function

    SECTION 6
    Other provisions concerning due diligence obligations
      Art. 46 Codes of conduct for online advertising

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines
      Art. 66 Initiation of proceedings by the Commission and cooperation in investigation

    SECTION 5
    Common provisions on enforcement
      Art. 84 Professional secrecy

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS

Article 14

Terms and conditions

1.   Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary services shall inform the recipients of the service of any significant change to the terms and conditions.

3.   Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider of that intermediary service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online platforms and of very large online search engines shall provide recipients of services with a concise, easily-accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online platforms and very large online search engines within the meaning of Article 33 shall publish their terms and conditions in the official languages of all the Member States in which they offer their services.
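Article 14(1) and (5) require the terms and conditions, and the summary for very large platforms and search engines, to be publicly available in a machine-readable format, but the Regulation prescribes no schema. A minimal sketch of what such a summary could look like, assuming a hypothetical JSON layout (every field name here is illustrative, not mandated):

```python
import json

# Hypothetical machine-readable summary of terms and conditions.
# The DSA requires machine readability (Art. 14(1) and 14(5)) but does
# not fix any schema; all field names below are illustrative only.
summary = {
    "service": "example-platform",
    "language": "en",
    "last_significant_change": "2024-01-15",  # supports the Art. 14(2) notice duty
    "content_moderation": {
        "policies": ["illegal content", "hate speech"],
        "tools": ["algorithmic decision-making", "human review"],
    },
    "restrictions": ["account suspension", "content removal"],
    "remedies": [
        "internal complaint-handling system (Art. 20)",
        "out-of-court dispute settlement (Art. 21)",
    ],
}

machine_readable = json.dumps(summary, indent=2)
print(machine_readable)
```

Publishing such a file alongside the human-readable text would satisfy the "easily accessible and machine-readable" wording under this assumed layout; the actual structure remains for providers (or future codes of conduct) to define.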

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
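Point (k) above imposes two duties on a provider: mark generated or manipulated media prominently before presentation, and give recipients an easy way to flag such content themselves. A minimal sketch of both duties, assuming a hypothetical in-memory content model (the marking format and all names are illustrative; the DSA prescribes none):

```python
from dataclasses import dataclass, field

@dataclass
class MediaItem:
    """Hypothetical content item as it might sit in a rendering pipeline."""
    url: str
    generated_or_manipulated: bool
    labels: list = field(default_factory=list)

def mark_synthetic(item: MediaItem) -> MediaItem:
    """Attach a prominent marking when the item is synthetic (Art. 35(1)(k))."""
    if item.generated_or_manipulated and "synthetic-media" not in item.labels:
        item.labels.append("synthetic-media")
    return item

def user_flag(item: MediaItem) -> MediaItem:
    """Easy-to-use functionality letting recipients indicate such content."""
    if "user-flagged-synthetic" not in item.labels:
        item.labels.append("user-flagged-synthetic")
    return item

clip = mark_synthetic(
    MediaItem(url="https://example.org/clip", generated_or_manipulated=True)
)
print(clip.labels)  # ['synthetic-media']
```

How the "prominent marking" label is then rendered (badge, watermark, interstitial) is left open by the provision; this sketch only shows the data-model side.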

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 41

Compliance function

1.   Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.

2.   The management body of the provider of the very large online platform or of the very large online search engine shall ensure that compliance officers have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3.

The management body of the provider of the very large online platform or of the very large online search engine shall ensure that the head of the compliance function is an independent senior manager with distinct responsibility for the compliance function.

The head of the compliance function shall report directly to the management body of the provider of the very large online platform or of the very large online search engine, and may raise concerns and warn that body where risks referred to in Article 34 or non-compliance with this Regulation affect or may affect the provider of the very large online platform or of the very large online search engine concerned, without prejudice to the responsibilities of the management body in its supervisory and managerial functions.

The head of the compliance function shall not be removed without prior approval of the management body of the provider of the very large online platform or of the very large online search engine.

3.   Compliance officers shall have the following tasks:

(a)

cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation;

(b)

ensuring that all risks referred to in Article 34 are identified and properly reported on and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35;

(c)

organising and supervising the activities of the provider of the very large online platform or of the very large online search engine relating to the independent audit pursuant to Article 37;

(d)

informing and advising the management and employees of the provider of the very large online platform or of the very large online search engine about relevant obligations under this Regulation;

(e)

monitoring the compliance of the provider of the very large online platform or of the very large online search engine with its obligations under this Regulation;

(f)

where applicable, monitoring the compliance of the provider of the very large online platform or of the very large online search engine with commitments made under the codes of conduct pursuant to Articles 45 and 46 or the crisis protocols pursuant to Article 48.

4.   Providers of very large online platforms or of very large online search engines shall communicate the name and contact details of the head of the compliance function to the Digital Services Coordinator of establishment and to the Commission.

5.   The management body of the provider of the very large online platform or of the very large online search engine shall define, oversee and be accountable for the implementation of the provider's governance arrangements that ensure the independence of the compliance function, including the division of responsibilities within the organisation of the provider of the very large online platform or of the very large online search engine, the prevention of conflicts of interest, and sound management of systemic risks identified pursuant to Article 34.

6.   The management body shall approve and review periodically, at least once a year, the strategies and policies for taking up, managing, monitoring and mitigating the risks identified pursuant to Article 34 to which the very large online platform or the very large online search engine is or might be exposed.

7.   The management body shall devote sufficient time to the consideration of the measures related to risk management. It shall be actively involved in the decisions related to risk management, and shall ensure that adequate resources are allocated to the management of the risks identified in accordance with Article 34.

Article 46

Codes of conduct for online advertising

1.   The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level by providers of online platforms and other relevant service providers, such as providers of online advertising intermediary services, other actors involved in the programmatic advertising value chain, or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for actors in the online advertising value chain beyond the requirements of Articles 26 and 39.

2.   The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information that fully respects the rights and interests of all parties involved, as well as a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct at least address the following:

(a)

the transmission of information held by providers of online advertising intermediaries to recipients of the service concerning the requirements set in Article 26(1), points (b), (c) and (d);

(b)

the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 39;

(c)

meaningful information on data monetisation.
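Point (a) of the list above concerns passing on the ad-transparency items of Article 26(1), points (b), (c) and (d): on whose behalf the advertisement is presented, who paid for it, and the main targeting parameters. A sketch of what such a transmitted record could look like, assuming a hypothetical JSON schema (neither the DSA nor any existing code of conduct fixes these field names):

```python
import json

# Illustrative ad-transparency record covering the information that
# Art. 46(2)(a) asks advertising intermediaries to transmit, i.e. the
# items listed in Art. 26(1)(b), (c) and (d). Schema is hypothetical.
ad_record = {
    "ad_id": "ad-0001",
    "on_whose_behalf": "Example Brand Ltd",   # Art. 26(1)(b)
    "paid_by": "Example Holdings plc",        # Art. 26(1)(c)
    "targeting_parameters": {                 # Art. 26(1)(d)
        "main_parameters": ["age: 25-34", "interest: cycling"],
        "how_to_change": "https://example.org/ad-preferences",
    },
}

print(json.dumps(ad_record, indent=2))
```

A record in this shape could serve both transmission to recipients under point (a) and submission to the Article 39 repositories under point (b), though the actual formats are left to the codes of conduct.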

3.   The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.

4.   The Commission shall encourage all the actors in the online advertising value chain referred to in paragraph 1 to endorse the commitments stated in the codes of conduct, and to comply with them.

Article 66

Initiation of proceedings by the Commission and cooperation in investigation

1.   The Commission may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 73 and 74 in respect of the relevant conduct by the provider of the very large online platform or of the very large online search engine that the Commission suspects of having infringed any of the provisions of this Regulation.

2.   Where the Commission decides to initiate proceedings pursuant to paragraph 1 of this Article, it shall notify all Digital Services Coordinators and the Board through the information sharing system referred to in Article 85, as well as the provider of the very large online platform or of the very large online search engine concerned.

The Digital Services Coordinators shall, without undue delay after being informed of initiation of the proceedings, transmit to the Commission any information they hold about the infringement at stake.

The initiation of proceedings pursuant to paragraph 1 of this Article by the Commission shall relieve the Digital Services Coordinator, or any competent authority where applicable, of its powers to supervise and enforce provided for in this Regulation pursuant to Article 56(4).

3.   In the exercise of its powers of investigation under this Regulation the Commission may request the individual or joint support of any Digital Services Coordinators concerned by the suspected infringement, including the Digital Services Coordinator of establishment. The Digital Services Coordinators that have received such a request, and, where involved by the Digital Services Coordinator, any other competent authority, shall cooperate sincerely and in a timely manner with the Commission and shall be entitled to exercise their investigative powers referred to in Article 51(1) in respect of the provider of the very large online platform or of the very large online search engine at stake, with regard to information, persons and premises located within the territory of their Member State and in accordance with the request.

4.   The Commission shall provide the Digital Services Coordinator of establishment and the Board with all relevant information about the exercise of the powers referred to in Articles 67 to 72 and its preliminary findings referred to in Article 79(1). The Board shall submit its views on those preliminary findings to the Commission within the period set pursuant to Article 79(2). The Commission shall take utmost account of any views of the Board in its decision.

Article 84

Professional secrecy

Without prejudice to the exchange and to the use of information referred to in this Chapter, the Commission, the Board, Member States’ competent authorities and their respective officials, servants and other persons working under their supervision, and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 72(2), shall not disclose information acquired or exchanged by them pursuant to this Regulation and of the kind covered by the obligation of professional secrecy.

