Digital Services Act (Regulation (EU) 2022/2065)



Contents of this excerpt (the articles reproduced below are listed under their chapters and sections):

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms
    • Art. 24 Transparency reporting obligations for providers of online platforms

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders
    • Art. 32 Right to information

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
    • Art. 37 Independent audit
    • Art. 41 Compliance function

    SECTION 6
    Other provisions concerning due diligence obligations
    • Art. 48 Crisis protocols

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators
    • Art. 51 Powers of Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS


Article 24

Transparency reporting obligations for providers of online platforms

1.   In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a)

the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b)

the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.
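
Note (illustrative): the point (a) figures (number of disputes, median time to complete the procedures, and the share of disputes where the body's decision was implemented) can be derived from a provider's dispute records roughly as in the Python sketch below; the record fields are assumptions, not a template prescribed by the Regulation or by any implementing act under paragraph 6.

```python
from statistics import median

# Illustrative records only: the field names are assumptions, not taken from
# the Regulation or from any Commission reporting template.
disputes = [
    {"days_to_complete": 21, "outcome": "upheld", "decision_implemented": True},
    {"days_to_complete": 35, "outcome": "rejected", "decision_implemented": False},
    {"days_to_complete": 14, "outcome": "upheld", "decision_implemented": True},
]

number_of_disputes = len(disputes)
median_completion_days = median(d["days_to_complete"] for d in disputes)
share_decisions_implemented = (
    sum(d["decision_implemented"] for d in disputes) / number_of_disputes
)

print(number_of_disputes, median_completion_days, share_decisions_implemented)
# 3 21 0.6666666666666666
```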

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
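
Note (illustrative): a minimal sketch of the paragraph 2 figure, assuming six invented monthly counts of active recipients in the Union; the definitive counting methodology is the one laid down in the delegated acts referred to in Article 33(3), once adopted.

```python
# Invented monthly figures; the actual counting methodology is the one laid
# down in the delegated acts referred to in Article 33(3), once adopted.
monthly_active_recipients_eu = [
    41_200_000, 42_900_000, 44_100_000, 43_700_000, 45_000_000, 45_800_000
]

# Average over the past six months, i.e. the figure to publish at least
# once every six months in a publicly available section of the interface.
average_monthly_active_recipients = round(
    sum(monthly_active_recipients_eu) / len(monthly_active_recipients_eu)
)
print(average_monthly_active_recipients)
```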

3.   Providers of online platforms or of online search engines shall communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms shall ensure that the information submitted does not contain personal data.
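
Note (illustrative): paragraph 5 calls for machine-readable submissions of the Article 17(1) statements of reasons. The sketch below shows one hypothetical record shape; the field names are assumptions and do not reflect the Commission's actual database schema or submission interface, and, as paragraph 5 requires, the record contains no personal data.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical record shape for an Article 17(1) statement of reasons; the
# field names are assumptions, not the Commission's actual schema.
@dataclass
class StatementOfReasons:
    decision_date: str            # ISO 8601 date of the decision
    decision_type: str            # e.g. "removal", "visibility_restriction", "suspension"
    ground: str                   # legal provision or terms-of-service clause relied on
    facts_and_circumstances: str  # free-text summary; must not contain personal data
    automated_detection: bool     # whether the content was detected by automated means
    automated_decision: bool      # whether the decision itself was automated
    redress_options: list[str]    # e.g. internal complaint, out-of-court settlement, court

record = StatementOfReasons(
    decision_date="2024-03-02",
    decision_type="removal",
    ground="terms_of_service:counterfeit_goods",
    facts_and_circumstances="Listing removed following a notice of a suspected counterfeit product.",
    automated_detection=True,
    automated_decision=False,
    redress_options=["internal_complaint", "out_of_court_settlement", "judicial_redress"],
)
print(json.dumps(asdict(record)))  # machine-readable payload, free of personal data
```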

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

Article 32

Right to information

1.   Where a provider of an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider shall inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following:

(a)

the fact that the product or service is illegal;

(b)

the identity of the trader; and

(c)

any relevant means of redress.

The obligation laid down in the first subparagraph shall be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality.

2.   Where, in the situation referred to in paragraph 1, the provider of the online platform allowing consumers to conclude distance contracts with traders does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress.

SECTION 5

Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online platform or of very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online platform or of very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
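
Note (illustrative): the minimum content listed in points (a) to (h) above can be thought of as a structured record, sketched below; the names and types are assumptions, pending any rules or reporting templates adopted under paragraph 7.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of the minimum content of an Article 37(4) audit report; field names
# and types are illustrative assumptions, not an official template.
@dataclass
class AuditReport:
    provider_name: str                       # point (a)
    provider_address: str
    provider_contact_point: str
    period_covered: str
    auditor_names_and_addresses: list[str]   # point (b)
    declaration_of_interests: str            # point (c)
    elements_audited: list[str]              # point (d)
    methodology: str
    main_findings_summary: str               # point (e)
    third_parties_consulted: list[str]       # point (f)
    audit_opinion: str                       # point (g): "positive", "positive with comments" or "negative"
    operational_recommendations: Optional[str] = None  # point (h): required where the opinion is not "positive"
    recommended_compliance_timeframe: Optional[str] = None
```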

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).

Article 41

Compliance function

1.   Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.

2.   The management body of the provider of the very large online platform or of the very large online search engine shall ensure that compliance officers have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3.

The management body of the provider of the very large online platform or of the very large online search engine shall ensure that the head of the compliance function is an independent senior manager with distinct responsibility for the compliance function.

The head of the compliance function shall report directly to the management body of the provider of the very large online platform or of the very large online search engine, and may raise concerns and warn that body where risks referred to in Article 34 or non-compliance with this Regulation affect or may affect the provider of the very large online platform or of the very large online search engine concerned, without prejudice to the responsibilities of the management body in its supervisory and managerial functions.

The head of the compliance function shall not be removed without prior approval of the management body of the provider of the very large online platform or of the very large online search engine.

3.   Compliance officers shall have the following tasks:

(a)

cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation;

(b)

ensuring that all risks referred to in Article 34 are identified and properly reported on and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35;

(c)

organising and supervising the activities of the provider of the very large online platform or of the very large online search engine relating to the independent audit pursuant to Article 37;

(d)

informing and advising the management and employees of the provider of the very large online platform or of the very large online search engine about relevant obligations under this Regulation;

(e)

monitoring the compliance of the provider of the very large online platform or of the very large online search engine with its obligations under this Regulation;

(f)

where applicable, monitoring the compliance of the provider of the very large online platform or of the very large online search engine with commitments made under the codes of conduct pursuant to Articles 45 and 46 or the crisis protocols pursuant to Article 48.

4.   Providers of very large online platforms or of very large online search engines shall communicate the name and contact details of the head of the compliance function to the Digital Services Coordinator of establishment and to the Commission.

5.   The management body of the provider of the very large online platform or of the very large online search engine shall define, oversee and be accountable for the implementation of the provider's governance arrangements that ensure the independence of the compliance function, including the division of responsibilities within the organisation of the provider of very large online platform or of very large online search engine, the prevention of conflicts of interest, and sound management of systemic risks identified pursuant to Article 34.

6.   The management body shall approve and review periodically, at least once a year, the strategies and policies for taking up, managing, monitoring and mitigating the risks identified pursuant to Article 34 to which the very large online platform or the very large online search engine is or might be exposed.

7.   The management body shall devote sufficient time to the consideration of the measures related to risk management. It shall be actively involved in the decisions related to risk management, and shall ensure that adequate resources are allocated to the management of the risks identified in accordance with Article 34.

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a)

prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b)

ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;

(c)

where applicable, adapt the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a)

the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b)

the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c)

a clear procedure for determining when the crisis protocol is to be activated;

(d)

a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e)

safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f)

a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

Article 51

Powers of Digital Services Coordinators

1.   Where needed in order to carry out their tasks under this Regulation, Digital Services Coordinators shall have the following powers of investigation, in respect of conduct by providers of intermediary services falling within the competence of their Member State:

(a)

the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Article 37 and Article 75(2), to provide such information without undue delay;

(b)

the power to carry out, or to request a judicial authority in their Member State to order, inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium;

(c)

the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers with their consent by any technical means.

2.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall have the following enforcement powers, in respect of providers of intermediary services falling within the competence of their Member State:

(a)

the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding;

(b)

the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end, or to request a judicial authority in their Member State to do so;

(c)

the power to impose fines, or to request a judicial authority in their Member State to do so, in accordance with Article 52 for failure to comply with this Regulation, including with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(d)

the power to impose a periodic penalty payment, or to request a judicial authority in their Member State to do so, in accordance with Article 52 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this subparagraph or for failure to comply with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(e)

the power to adopt interim measures or to request the competent national judicial authority in their Member State to do so, to avoid the risk of serious harm.

As regards the first subparagraph, points (c) and (d), Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after providing those other persons in good time with all relevant information relating to such orders, including the applicable period, the fines or periodic payments that may be imposed for failure to comply and the possibilities for redress.

3.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall, in respect of providers of intermediary services falling within the competence of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted and the infringement has not been remedied or is continuing and is causing serious harm which cannot be avoided through the exercise of other powers available under Union or national law, also have the power to take the following measures:

(a)

to require the management body of those providers, without undue delay, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;

(b)

where the Digital Services Coordinator considers that a provider of intermediary services has not sufficiently complied with the requirements referred to in point (a), that the infringement has not been remedied or is continuing and is causing serious harm, and that that infringement entails a criminal offence involving a threat to the life or safety of persons, to request that the competent judicial authority of its Member State order the temporary restriction of access of recipients to the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.

The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 82, prior to submitting the request referred to in the first subparagraph, point (b), of this paragraph invite interested parties to submit written observations within a period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider of intermediary services, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.

The restriction of access shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same lengths, subject to a maximum number of extensions set by that judicial authority. The Digital Services Coordinator shall only extend the period where, having regard to the rights and interests of all parties affected by that restriction and all relevant circumstances, including any information that the provider of intermediary services, the addressee or addressees and any other third party that demonstrated a legitimate interest may provide to it, it considers that both of the following conditions have been met:

(a)

the provider of intermediary services has failed to take the necessary measures to terminate the infringement;

(b)

the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist.

Where the Digital Services Coordinator considers that the conditions set out in the third subparagraph, points (a) and (b), have been met but it cannot further extend the period pursuant to the third subparagraph, it shall submit a new request to the competent judicial authority, as referred to in the first subparagraph, point (b).

4.   The powers listed in paragraphs 1, 2 and 3 shall be without prejudice to Section 3.

5.   The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant.

6.   Member States shall lay down specific rules and procedures for the exercise of the powers pursuant to paragraphs 1, 2 and 3 and shall ensure that any exercise of those powers is subject to adequate safeguards laid down in the applicable national law in compliance with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties.

