Digital Services Act (Regulation (EU) 2022/2065)

Article 1

Subject matter

1.   The aim of this Regulation is to contribute to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which fundamental rights enshrined in the Charter, including the principle of consumer protection, are effectively protected.

2.   This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:

(a) a framework for the conditional exemption from liability of providers of intermediary services;

(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;

(c) rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities.

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a) adapting the design, features or functioning of their services, including their online interfaces;

(b) adapting their terms and conditions and their enforcement;

(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d) testing and adapting their algorithmic systems, including their recommender systems;

(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities, in particular as regards detection of systemic risk;

(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
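
Point (k) asks, in effect, for two technical capabilities: a prominent marking shown whenever a generated or manipulated item is presented, and an easy way for recipients to indicate such an item. The following minimal sketch shows one way a provider might model this; all names and fields are hypothetical and not prescribed by the Regulation.

```python
from dataclasses import dataclass, field

@dataclass
class MediaItem:
    """A piece of content as prepared for presentation on an online interface."""
    item_id: str
    media_type: str                 # e.g. "image", "audio", "video"
    is_synthetic: bool = False      # generated or manipulated, per Article 35(1)(k)
    recipient_reports: list = field(default_factory=list)

def prominent_marking(item: MediaItem) -> str:
    """Return the marking to display alongside a synthetic item, if any."""
    return "Generated or manipulated content" if item.is_synthetic else ""

def indicate_as_synthetic(item: MediaItem, recipient_id: str, note: str) -> None:
    """The 'easy to use functionality' by which recipients indicate such information."""
    item.recipient_reports.append({"recipient": recipient_id, "note": note})

# Usage: a synthetic video is marked when presented, and a recipient flags it.
clip = MediaItem(item_id="v1", media_type="video", is_synthetic=True)
print(prominent_marking(clip))
indicate_as_synthetic(clip, recipient_id="u42", note="appears to depict a real person")
```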

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 36

Crisis response mechanism

1.   Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision requiring one or more providers of very large online platforms or of very large online search engines to take one or more of the following actions:

(a) assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;

(b) identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

(c) report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.

When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

2.   For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

3.   When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:

(a) the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;

(b) the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;

(c) the actions required by the decision are limited to a period not exceeding three months.

4.   After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:

(a) notify the decision to the provider or providers to which the decision is addressed;

(b) make the decision publicly available; and

(c) inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.

5.   The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.

6.   The Commission may, on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).

7.   The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.

Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate, it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.

8.   Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:

(a) revoking the decision and, where appropriate, requiring the very large online platform or very large online search engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;

(b) extending the period referred to in paragraph 3, point (c), by a period of no more than three months;

(c) taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

9.   The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.

10.   The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.

11.   The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.
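
Paragraph 3, point (c), caps the actions required by a crisis decision at three months, and paragraph 8, point (b), lets the Commission extend that period by no more than three months at a time. A small worked check of the resulting outer time limit, assuming each amendment grants the full three-month extension (the Article does not fix the number of extensions, so that parameter is left open here):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date by whole months, clamping to the last valid day where needed."""
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, days))

def latest_end(decision_date: date, extensions: int = 0) -> date:
    """Latest permissible end of the required actions: three months under
    Article 36(3)(c), plus at most three months per extension under
    Article 36(8)(b)."""
    if extensions < 0:
        raise ValueError("extensions cannot be negative")
    return add_months(decision_date, 3 * (1 + extensions))

# A decision adopted on 15 January 2024 and extended once must end by 15 July 2024.
print(latest_end(date(2024, 1, 15), extensions=1))  # 2024-07-15
```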

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a) the obligations set out in Chapter III;

(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied by versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a) are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i) have not provided non-audit services related to the matters audited to the provider of the very large online platform or of the very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii) have not provided auditing services pursuant to this Article to the provider of the very large online platform or of the very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii) are not performing the audit in return for fees which are contingent on the result of the audit;

(b) have proven expertise in the area of risk management, technical competence and capabilities;

(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
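
The independence conditions in paragraph 3, point (a), are mechanical enough to check programmatically. A simplified sketch under stated assumptions: it models only the auditing organisation itself, not legal persons connected to the provider, and it treats the 12-month periods as 365 days.

```python
from datetime import date, timedelta

def auditor_is_independent(
    audit_start: date,
    last_non_audit_service: date | None,   # most recent non-audit service to the provider
    committed_no_services_12m_after: bool, # commitment under point (a)(i), second limb
    consecutive_years_as_auditor: int,     # rotation condition, point (a)(ii)
    fees_contingent_on_result: bool,       # point (a)(iii)
) -> bool:
    """Check the independence conditions of Article 37(3), point (a) (simplified)."""
    # (a)(i): no non-audit services in the 12 months before the audit ...
    if (last_non_audit_service is not None
            and audit_start - last_non_audit_service < timedelta(days=365)):
        return False
    # ... and a commitment to provide none in the 12 months after it
    if not committed_no_services_12m_after:
        return False
    # (a)(ii): no more than 10 consecutive years auditing this provider
    if consecutive_years_as_auditor > 10:
        return False
    # (a)(iii): fees must not be contingent on the result of the audit
    return not fees_contingent_on_result

# Usage: a first-time auditor with no prior services and fixed fees is eligible.
print(auditor_is_independent(date(2025, 3, 1), None, True, 1, False))  # True
```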

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a) the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b) the name and address of the organisation or organisations performing the audit;

(c) a declaration of interests;

(d) a description of the specific elements audited, and the methodology applied;

(e) a description and a summary of the main findings drawn from the audit;

(f) a list of the third parties consulted as part of the audit;

(g) an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h) where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
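
Points (a) to (h) amount to a minimum data structure for the report. One possible encoding, with illustrative field names that are not taken from the Regulation; the consistency rule from point (h) is enforced at construction time:

```python
from dataclasses import dataclass, field
from enum import Enum

class AuditOpinion(Enum):
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"

@dataclass
class AuditReport:
    """Minimum content of an audit report under Article 37(4), points (a) to (h)."""
    provider_name: str                      # (a) name, address, point of contact, period
    provider_address: str
    provider_contact: str
    period_covered: str
    auditing_organisations: list[str]       # (b)
    declaration_of_interests: str           # (c)
    elements_audited: str                   # (d) elements and methodology applied
    methodology: str
    main_findings: str                      # (e)
    third_parties_consulted: list[str]      # (f)
    opinion: AuditOpinion                   # (g)
    recommendations: list[str] = field(default_factory=list)  # (h)
    recommended_timeframe: str = ""

    def __post_init__(self) -> None:
        # Point (h): a non-'positive' opinion must carry operational recommendations.
        if self.opinion is not AuditOpinion.POSITIVE and not self.recommendations:
            raise ValueError("non-'positive' opinions require operational recommendations")
```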

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).

Article 41

Compliance function

1.   Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.

2.   The management body of the provider of the very large online platform or of the very large online search engine shall ensure that compliance officers have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3.

The management body of the provider of the very large online platform or of the very large online search engine shall ensure that the head of the compliance function is an independent senior manager with distinct responsibility for the compliance function.

The head of the compliance function shall report directly to the management body of the provider of the very large online platform or of the very large online search engine, and may raise concerns and warn that body where risks referred to in Article 34 or non-compliance with this Regulation affect or may affect the provider of the very large online platform or of the very large online search engine concerned, without prejudice to the responsibilities of the management body in its supervisory and managerial functions.

The head of the compliance function shall not be removed without prior approval of the management body of the provider of the very large online platform or of the very large online search engine.

3.   Compliance officers shall have the following tasks:

(a) cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation;

(b) ensuring that all risks referred to in Article 34 are identified and properly reported on and that reasonable, proportionate and effective risk-mitigation measures are taken pursuant to Article 35;

(c) organising and supervising the activities of the provider of the very large online platform or of the very large online search engine relating to the independent audit pursuant to Article 37;

(d) informing and advising the management and employees of the provider of the very large online platform or of the very large online search engine about relevant obligations under this Regulation;

(e) monitoring the compliance of the provider of the very large online platform or of the very large online search engine with its obligations under this Regulation;

(f) where applicable, monitoring the compliance of the provider of the very large online platform or of the very large online search engine with commitments made under the codes of conduct pursuant to Articles 45 and 46 or the crisis protocols pursuant to Article 48.

4.   Providers of very large online platforms or of very large online search engines shall communicate the name and contact details of the head of the compliance function to the Digital Services Coordinator of establishment and to the Commission.

5.   The management body of the provider of the very large online platform or of the very large online search engine shall define, oversee and be accountable for the implementation of the provider's governance arrangements that ensure the independence of the compliance function, including the division of responsibilities within the organisation of the provider of the very large online platform or of the very large online search engine, the prevention of conflicts of interest, and sound management of systemic risks identified pursuant to Article 34.

6.   The management body shall approve and review periodically, at least once a year, the strategies and policies for taking up, managing, monitoring and mitigating the risks identified pursuant to Article 34 to which the very large online platform or the very large online search engine is or might be exposed.

7.   The management body shall devote sufficient time to the consideration of the measures related to risk management. It shall be actively involved in the decisions related to risk management, and shall ensure that adequate resources are allocated to the management of the risks identified in accordance with Article 34.

Article 42

Transparency reporting obligations

1.   Providers of very large online platforms or of very large online search engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a) the human resources that the provider of very large online platforms dedicates to content moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b) the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c) the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.

3.   In addition to the information referred to in Article 24(2), the providers of very large online platforms or of very large online search engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.

4.   Providers of very large online platforms or of very large online search engines shall transmit to the Digital Services Coordinator of establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a) a report setting out the results of the risk assessment pursuant to Article 34;

(b) the specific mitigation measures put in place pursuant to Article 35(1);

(c) the audit report provided for in Article 37(4);

(d) the audit implementation report provided for in Article 37(6);

(e) where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of a very large online platform or of a very large online search engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.
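
Paragraphs 4 and 5 together describe a two-track publication: the complete package goes to the Digital Services Coordinator of establishment and the Commission, while the public copy may have confidential passages removed, with reasons stated. A minimal sketch of that flow; the types and the redaction mechanism are hypothetical:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class TransparencyPackage:
    """The documents listed in Article 42(4), points (a) to (e)."""
    risk_assessment: str              # (a)
    mitigation_measures: str          # (b)
    audit_report: str                 # (c)
    audit_implementation_report: str  # (d)
    consultations: str = ""           # (e) where applicable

def publish(full: TransparencyPackage,
            redactions: dict[str, str],
            reasons: str) -> tuple[TransparencyPackage, TransparencyPackage, str]:
    """Return (complete package for the authorities, public version, statement of
    reasons); under Article 42(5) the complete reports still go to the Digital
    Services Coordinator of establishment and the Commission."""
    public = replace(full, **redactions)  # swap redacted fields into a public copy
    return full, public, reasons

full = TransparencyPackage("...", "...", "audit report incl. trade secrets", "...")
authorities_copy, public_copy, why = publish(
    full,
    redactions={"audit_report": "audit report [confidential passages removed]"},
    reasons="protection of trade secrets per Article 42(5)",
)
```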

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a) electronic submission of notices under Article 16;

(b) templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c) electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d) specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e) auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f) interoperability of the advertisement repositories referred to in Article 39(2);

(g) transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h) technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i) choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j) standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.
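
Points (a) and (c) of paragraph 1 envisage standardised, machine-readable submission of Article 16 notices, including through application programming interfaces. No schema exists in the Regulation itself; the sketch below is a hypothetical payload built from the elements that Article 16(2) requires a notice to contain.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Notice:
    """Hypothetical Article 16 notice payload; the concrete schema is left to the
    voluntary standards envisaged in Article 44(1), points (a) and (c)."""
    explanation: str            # substantiated reasons why the content is allegedly illegal
    content_location: str       # exact electronic location, e.g. one or more URLs
    submitter_name: str         # name and email of the submitter (with exceptions)
    submitter_email: str
    good_faith_statement: bool  # bona fide statement of accuracy and completeness
    trusted_flagger: bool = False  # notices under Article 22 are handled with priority

def submit_notice(notice: Notice) -> str:
    """Serialise a notice for an electronic submission endpoint (hypothetical)."""
    return json.dumps(asdict(notice))

# Usage: a trusted flagger submits a notice about a specific URL.
print(submit_notice(Notice(
    explanation="counterfeit goods offered for sale",
    content_location="https://example.com/listing/123",
    submitter_name="Example Flagger Org",
    submitter_email="notices@example.org",
    good_faith_statement=True,
    trusted_flagger=True,
)))
```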

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a) prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b) ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;

(c) where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a) the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b) the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c) a clear procedure for determining when the crisis protocol is to be activated;

(d) a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f) a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.
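
Points (a) to (f) read like the required sections of a protocol document, so they can be captured as a simple structured record. An illustrative encoding, with field names invented for this sketch:

```python
from dataclasses import dataclass

@dataclass
class CrisisProtocol:
    """Elements a crisis protocol must set out under Article 48(4)."""
    trigger_parameters: str             # (a) what counts as the extraordinary circumstance, and objectives
    participant_roles: dict             # (b) participant -> preparatory and activated measures
    activation_procedure: str           # (c) when the protocol is to be activated
    measure_period: str                 # (d) duration, strictly limited to what is necessary
    fundamental_rights_safeguards: str  # (e) esp. freedom of expression and non-discrimination
    public_reporting_process: str       # (f) public report after the crisis ends

# Usage: participants fill in each element before the Commission's review under paragraph 5.
```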

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

Article 63

Tasks of the Board

1.   Where necessary to meet the objectives set out in Article 61(2), the Board shall in particular:

(a) support the coordination of joint investigations;

(b) support the competent authorities in the analysis of reports and results of audits of very large online platforms or of very large online search engines to be transmitted pursuant to this Regulation;

(c) issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation, taking into account, in particular, the freedom to provide services of the providers of intermediary services;

(d) advise the Commission on the measures referred to in Article 66 and adopt opinions concerning very large online platforms or very large online search engines in accordance with this Regulation;

(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in cooperation with relevant stakeholders as provided for in this Regulation, including by issuing opinions or recommendations on matters related to Article 44, as well as the identification of emerging issues, with regard to matters covered by this Regulation.

2.   Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.

SECTION 4

Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

Article 72

Monitoring actions

1.   For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of and compliance with this Regulation by providers of very large online platforms and of very large online search engines. The Commission may order them to provide access to, and explanations relating to, their databases and algorithms. Such actions may include imposing an obligation on the provider of the very large online platform or of the very large online search engine to retain all documents deemed to be necessary to assess the implementation of and compliance with the obligations under this Regulation.

2.   The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors, as well as experts and auditors from competent national authorities with the agreement of the authority concerned, to assist the Commission in monitoring the effective implementation of and compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Commission.

Article 73

Non-compliance

1.   The Commission shall adopt a non-compliance decision where it finds that the provider of the very large online platform or of the very large online search engine concerned does not comply with one or more of the following:

(a) the relevant provisions of this Regulation;

(b) interim measures ordered pursuant to Article 70;

(c) commitments made binding pursuant to Article 71.

2.   Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the provider of the very large online platform or of the very large online search engine concerned. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the provider of the very large online platform or of the very large online search engine concerned should take, in order to effectively address the preliminary findings.

3.   In the decision adopted pursuant to paragraph 1 the Commission shall order the provider of the very large online platform or of the very large online search engine concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable period specified therein and to provide information on the measures that that provider intends to take to comply with the decision.

4.   The provider of the very large online platform or of the very large online search engine concerned shall provide the Commission with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation.

5.   Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. The decision shall apply with immediate effect.

Article 75

Enhanced supervision of remedies to address infringements of obligations laid down in Section 5 of Chapter III

1.   When adopting a decision pursuant to Article 73 in relation to an infringement by a provider of a very large online platform or of a very large online search engine of any of the provisions of Section 5 of Chapter III, the Commission shall make use of the enhanced supervision system laid down in this Article. When doing so, it shall take utmost account of any opinion of the Board pursuant to this Article.

2.   In the decision referred to in Article 73, the Commission shall require the provider of a very large online platform or of a very large online search engine concerned to draw up and communicate, within a reasonable period specified in the decision, to the Digital Services Coordinators, the Commission and the Board an action plan setting out the necessary measures which are sufficient to terminate or remedy the infringement. Those measures shall include a commitment to perform an independent audit in accordance with Article 37(3) and (4) on the implementation of the other measures, and shall specify the identity of the auditors, as well as the methodology, timing and follow-up of the audit. The measures may also include, where appropriate, a commitment to participate in a relevant code of conduct, as provided for in Article 45.

3.   Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the Commission. Within one month following receipt of that opinion, the Commission shall decide whether the measures set out in the action plan are sufficient to terminate or remedy the infringement, and shall set a reasonable period for its implementation. The possible commitment to adhere to relevant codes of conduct shall be taken into account in that decision. The Commission shall subsequently monitor the implementation of the action plan. To that end, the provider of a very large online platform or of a very large online search engine concerned shall communicate the audit report to the Commission without undue delay after it becomes available, and shall keep the Commission up to date on steps taken to implement the action plan. The Commission may, where necessary for such monitoring, require the provider of a very large online platform or of a very large online search engine concerned to provide additional information within a reasonable period set by the Commission.

The Commission shall keep the Board and the Digital Services Coordinators informed about the implementation of the action plan, and about its monitoring thereof.

4.   The Commission may take necessary measures in accordance with this Regulation, in particular Article 76(1), point (e), and Article 82(1), where:

(a) the provider of the very large online platform or of the very large online search engine concerned fails to provide any action plan, the audit report, the necessary updates or any additional information required, within the applicable period;

(b) the Commission rejects the proposed action plan because it considers that the measures set out therein are insufficient to terminate or remedy the infringement; or

(c) the Commission considers, on the basis of the audit report, any updates or additional information provided or any other relevant information available to it, that the implementation of the action plan is insufficient to terminate or remedy the infringement.

