Digital Services Act (Regulation (EU) 2022/2065)

Article 2

Scope

1.   This Regulation shall apply to intermediary_services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary_services have their place of establishment.

2.   This Regulation shall not apply to any service that is not an intermediary_service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary_service.

3.   This Regulation shall not affect the application of Directive 2000/31/EC.

4.   This Regulation is without prejudice to the rules laid down by other Union legal acts regulating other aspects of the provision of intermediary_services in the internal market or specifying and complementing this Regulation, in particular, the following:

(a)

Directive 2010/13/EU;

(b)

Union law on copyright and related rights;

(c)

Regulation (EU) 2021/784;

(d)

Regulation (EU) 2019/1148;

(e)

Regulation (EU) 2019/1150;

(f)

Union law on consumer protection and product safety, including Regulations (EU) 2017/2394 and (EU) 2019/1020 and Directives 2001/95/EC and 2013/11/EU;

(g)

Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC;

(h)

Union law in the field of judicial cooperation in civil matters, in particular Regulation (EU) No 1215/2012 or any Union legal act laying down the rules on law applicable to contractual and non-contractual obligations;

(i)

Union law in the field of judicial cooperation in criminal matters, in particular a Regulation on European Production and Preservation Orders for electronic evidence in criminal matters;

(j)

a Directive laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings.

Article 3

Definitions

For the purpose of this Regulation, the following definitions shall apply:

(a)

‘information_society_service’ means a ‘service’ as defined in Article 1(1), point (b), of Directive (EU) 2015/1535;

(b)

‘recipient_of_the_service’ means any natural or legal person who uses an intermediary_service, in particular for the purposes of seeking information or making it accessible;

(c)

‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;

(d)

‘to_offer_services_in_the_Union’ means enabling natural or legal persons in one or more Member States to use the services of a provider of intermediary_services that has a substantial_connection_to_the_Union;

(e)

‘substantial_connection_to_the_Union’ means a connection of a provider of intermediary_services with the Union resulting either from its establishment in the Union or from specific factual criteria, such as:

a significant number of recipients of the service in one or more Member States in relation to its or their population; or

the targeting of activities towards one or more Member States;

(f)

‘trader’ means any natural person, or any legal person irrespective of whether it is privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

(g)

‘intermediary_service’ means one of the following information_society_services:

(i)

a ‘mere_conduit’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, or the provision of access to a communication network;

(ii)

a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

(iii)

a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient_of_the_service;

(h)

‘illegal_content’ means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

(i)

‘online_platform’ means a hosting service that, at the request of a recipient_of_the_service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

(j)

‘online_search_engine’ means an intermediary_service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

(k)

‘dissemination_to_the_public’ means making information available, at the request of the recipient_of_the_service who provided the information, to a potentially unlimited number of third parties;

(l)

‘distance_contract’ means ‘distance_contract’ as defined in Article 2, point (7), of Directive 2011/83/EU;

(m)

‘online_interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

(n)

‘Digital_Services_Coordinator_of_establishment’ means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary_service is located or its legal representative resides or is established;

(o)

‘Digital_Services_Coordinator_of_destination’ means the Digital Services Coordinator of a Member State where the intermediary_service is provided;

(p)

‘active recipient of an online_platform’ means a recipient_of_the_service that has engaged with an online_platform by either requesting the online_platform to host information or being exposed to information hosted by the online_platform and disseminated through its online_interface;

(q)

‘active recipient of an online_search_engine’ means a recipient_of_the_service that has submitted a query to an online_search_engine and been exposed to information indexed and presented on its online_interface;

(r)

‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online_platform on its online_interface against remuneration specifically for promoting that information;

(s)

‘recommender_system’ means a fully or partially automated system used by an online_platform to suggest in its online_interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient_of_the_service or otherwise determining the relative order or prominence of information displayed;

(t)

‘content_moderation’ means the activities, whether automated or not, undertaken by providers of intermediary_services, that are aimed, in particular, at detecting, identifying and addressing illegal_content or information incompatible with their terms_and_conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal_content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account;

(u)

‘terms_and_conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary_services and the recipients of the service;

(v)

‘persons_with_disabilities’ means ‘persons_with_disabilities’ as referred to in Article 3, point (1), of Directive (EU) 2019/882 of the European Parliament and of the Council (38);

(w)

‘commercial_communication’ means ‘commercial_communication’ as defined in Article 2, point (f), of Directive 2000/31/EC;

(x)

‘turnover’ means the amount derived by an undertaking within the meaning of Article 5(1) of Council Regulation (EC) No 139/2004 (39).
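
For illustration only, the intermediary service categories defined in point (g) above, and the online_platform test in point (i), can be summarised as a small type model. The following TypeScript sketch is not part of the Regulation; all identifiers are editorial shorthand.

```typescript
// Illustrative sketch only: a minimal type model of the service categories in
// Article 3, point (g), and of the online_platform test in point (i).
// All identifiers are editorial shorthand, not terms used by the Regulation.

type IntermediaryServiceKind =
  | "mere_conduit" // point (g)(i): transmission in, or access to, a communication network
  | "caching"      // point (g)(ii): automatic, intermediate and temporary storage for onward transmission
  | "hosting";     // point (g)(iii): storage of information at the recipient's request

interface IntermediaryService {
  kind: IntermediaryServiceKind;
  // Point (i): relevant only for hosting services.
  disseminatesToPublic?: boolean;
  // Point (i) carve-out: a minor and purely ancillary feature of another service.
  minorAncillaryFeature?: boolean;
}

// Point (i): a hosting service is an online_platform if it disseminates stored
// information to the public and that activity is not a minor, purely ancillary feature.
function isOnlinePlatform(s: IntermediaryService): boolean {
  return (
    s.kind === "hosting" &&
    s.disseminatesToPublic === true &&
    s.minorAncillaryFeature !== true
  );
}
```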

CHAPTER II

LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

Article 6

Hosting

1.   Where an information_society_service is provided that consists of the storage of information provided by a recipient_of_the_service, the service provider shall not be liable for the information stored at the request of a recipient_of_the_service, on condition that the provider:

(a)

does not have actual knowledge of illegal activity or illegal_content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal_content is apparent; or

(b)

upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal_content.

2.   Paragraph 1 shall not apply where the recipient_of_the_service is acting under the authority or the control of the provider.

3.   Paragraph 1 shall not apply with respect to the liability under consumer protection law of online_platforms that allow consumers to conclude distance_contracts with traders, where such an online_platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online_platform itself or by a recipient_of_the_service who is acting under its authority or control.

4.   This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State's legal system, to require the service provider to terminate or prevent an infringement.

Article 14

Terms and conditions

1.   Providers of intermediary_services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms_and_conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content_moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary_services shall inform the recipients of the service of any significant change to the terms_and_conditions.

3.   Where an intermediary_service is primarily directed at minors or is predominantly used by them, the provider of that intermediary_service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary_services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online_platforms and of very large online_search_engines shall provide recipients of services with a concise, easily-accessible and machine-readable summary of the terms_and_conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online_platforms and very large online_search_engines within the meaning of Article 33 shall publish their terms_and_conditions in the official languages of all the Member States in which they offer their services.

Article 15

Transparency reporting obligations for providers of intermediary_services

1.   Providers of intermediary_services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content_moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a)

for providers of intermediary_services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal_content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b)

for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal_content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms_and_conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c)

for providers of intermediary_services, meaningful and comprehensible information about the content_moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content_moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal_content or violation of the terms_and_conditions of the service provider, by the detection method and by the type of restriction applied;

(d)

for providers of intermediary_services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms_and_conditions and additionally, for providers of online_platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e)

any use made of automated means for the purpose of content_moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.
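
For illustration only, the reporting elements listed in points (a) to (e) can be read as one structured record per reporting period. The TypeScript sketch below uses invented field names; the binding form and content, if any, are those laid down by the Commission implementing acts referred to in paragraph 3.

```typescript
// Illustrative sketch of an Article 15(1) transparency report. Field names are
// invented; official templates may be set by implementing acts under Article 15(3).

interface TransparencyReport {
  reportingPeriod: { from: string; to: string }; // ISO dates

  // Point (a): orders received from Member States' authorities.
  memberStateOrders: Array<{
    legalBasis: "Article 9" | "Article 10" | "other";
    illegalContentType: string;
    issuingMemberState: string;
    medianTimeToConfirmReceiptDays: number;
    medianTimeToGiveEffectDays: number;
  }>;

  // Point (b): Article 16 notices (providers of hosting services only).
  notices?: {
    total: number;
    byAllegedContentType: Record<string, number>;
    fromTrustedFlaggers: number;
    actionsTakenOnLegalGround: number;
    actionsTakenOnTermsAndConditions: number;
    processedByAutomatedMeans: number;
    medianTimeToActionDays: number;
  };

  // Point (c): own-initiative content moderation.
  ownInitiativeModeration: {
    automatedToolsUsed: boolean;
    trainingAndAssistanceMeasures: string;
    measures: Array<{
      contentTypeOrViolation: string;
      detectionMethod: "automated" | "human" | "hybrid";
      restrictionType: string;
      count: number;
    }>;
  };

  // Point (d): internal complaint-handling systems.
  complaints: {
    received: number;
    basisBreakdown: Record<string, number>;
    medianTimeToDecisionDays: number;
    decisionsReversed: number;
  };

  // Point (e): automated means used for content moderation.
  automatedMeans: Array<{
    qualitativeDescription: string;
    purposes: string[];
    accuracyIndicators: string;
    possibleErrorRate?: number;
    safeguards: string;
  }>;
}
```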

2.   Paragraph 1 of this Article shall not apply to providers of intermediary_services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online_platforms within the meaning of Article 33 of this Regulation.

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

SECTION 2

Additional provisions applicable to providers of hosting services, including online_platforms

Article 16

Notice and action mechanisms

1.   Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal_content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.

2.   The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:

(a)

a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal_content;

(b)

a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal_content adapted to the type of content and to the specific type of hosting service;

(c)

the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;

(d)

a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
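
For illustration only, a notice containing the elements in points (a) to (d) might be modelled as below, together with a minimal completeness check. The TypeScript sketch uses invented field names; providers remain free to design their own electronic submission mechanisms.

```typescript
// Hypothetical model of an Article 16(2) notice and a minimal completeness check.
// Field names are invented for illustration.

interface Article16Notice {
  // Point (a): why the notifier considers the information to be illegal_content.
  reasonsExplanation: string;
  // Point (b): exact electronic location(s), such as URLs.
  locations: string[];
  // Point (c): identity of the notifier; may be omitted where the information is
  // considered to involve an offence under Articles 3 to 7 of Directive 2011/93/EU.
  notifier?: { name: string; email: string };
  // Point (d): bona fide statement that the notice is accurate and complete.
  bonaFideStatement: boolean;
}

function containsAllRequiredElements(
  notice: Article16Notice,
  suspectedChildAbuseOffence = false
): boolean {
  return (
    notice.reasonsExplanation.trim().length > 0 &&
    notice.locations.length > 0 &&
    notice.bonaFideStatement === true &&
    (suspectedChildAbuseOffence || notice.notifier !== undefined)
  );
}
```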

3.   Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

4.   Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.

5.   The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.

6.   Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.

Article 17

Statement of reasons

1.   Providers of hosting services shall provide a clear and specific statement of reasons to any affected recipients of the service for any of the following restrictions imposed on the ground that the information provided by the recipient_of_the_service is illegal_content or incompatible with their terms_and_conditions:

(a)

any restrictions of the visibility of specific items of information provided by the recipient_of_the_service, including removal of content, disabling access to content, or demoting content;

(b)

suspension, termination or other restriction of monetary payments;

(c)

suspension or termination of the provision of the service in whole or in part;

(d)

suspension or termination of the recipient_of_the_service's account.

2.   Paragraph 1 shall only apply where the relevant electronic contact details are known to the provider. It shall apply at the latest from the date that the restriction is imposed, regardless of why or how it was imposed.

Paragraph 1 shall not apply where the information is deceptive high-volume commercial content.

3.   The statement of reasons referred to in paragraph 1 shall at least contain the following information:

(a)

information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, or imposes other measures referred to in paragraph 1 with regard to the information, and, where relevant, the territorial scope of the decision and its duration;

(b)

the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted in accordance with Article 16 or based on voluntary own-initiative investigations and, where strictly necessary, the identity of the notifier;

(c)

where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of content detected or identified using automated means;

(d)

where the decision concerns allegedly illegal_content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal_content on that ground;

(e)

where the decision is based on the alleged incompatibility of the information with the terms_and_conditions of the provider of hosting services, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;

(f)

clear and user-friendly information on the possibilities for redress available to the recipient_of_the_service in respect of the decision, in particular, where applicable through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.
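
For illustration only, the minimum content required by paragraph 3 can be captured in a single record. The TypeScript sketch below uses invented names and value sets; it does not reproduce the format of any official template or database.

```typescript
// Hypothetical record mirroring the minimum content of an Article 17(3)
// statement of reasons. All identifiers are invented for illustration.

type RestrictionType =
  | "removal"
  | "disabling_of_access"
  | "demotion"
  | "other_visibility_restriction"
  | "monetary_payment_restriction"
  | "service_suspension_or_termination"
  | "account_suspension_or_termination";

interface StatementOfReasons {
  // Point (a): the measure taken and, where relevant, its territorial scope and duration.
  restriction: RestrictionType;
  territorialScope?: string[];
  duration?: string;

  // Point (b): facts and circumstances, including what triggered the decision.
  factsAndCircumstances: string;
  trigger: "article_16_notice" | "own_initiative_investigation";
  notifierIdentity?: string; // only where strictly necessary

  // Point (c): use made of automated means.
  automatedDecision: boolean;
  automatedDetection: boolean;

  // Points (d) and (e): the legal or contractual ground and its explanation.
  ground:
    | { type: "illegal_content"; legalReference: string; explanation: string }
    | { type: "terms_and_conditions"; clauseReference: string; explanation: string };

  // Point (f): available possibilities for redress.
  redress: Array<"internal_complaint" | "out_of_court_dispute_settlement" | "judicial_redress">;
}
```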

4.   The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient_of_the_service concerned to effectively exercise the possibilities for redress referred to in paragraph 3, point (f).

5.   This Article shall not apply to any orders referred to in Article 9.

Article 19

Exclusion for micro and small enterprises

1.   This Section, with the exception of Article 24(3) thereof, shall not apply to providers of online_platforms that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC.

This Section, with the exception of Article 24(3) thereof, shall not apply to providers of online_platforms that previously qualified for the status of a micro or small enterprise as defined in Recommendation 2003/361/EC during the 12 months following their loss of that status pursuant to Article 4(2) thereof, except when they are very large online_platforms in accordance with Article 33.

2.   By derogation from paragraph 1 of this Article, this Section shall apply to providers of online_platforms that have been designated as very large online_platforms in accordance with Article 33, irrespective of whether they qualify as micro or small enterprises.

Article 20

Internal complaint-handling system

1.   Providers of online_platforms shall provide recipients of the service, including individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against the decision taken by the provider of the online_platform upon the receipt of a notice or against the following decisions taken by the provider of the online_platform on the grounds that the information provided by the recipients constitutes illegal_content or is incompatible with its terms_and_conditions:

(a)

decisions whether or not to remove or disable access to or restrict visibility of the information;

(b)

decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;

(c)

decisions whether or not to suspend or terminate the recipients’ account;

(d)

decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients.

2.   The period of at least six months referred to in paragraph 1 of this Article shall start on the day on which the recipient_of_the_service is informed about the decision in accordance with Article 16(5) or Article 17.

3.   Providers of online_platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.

4.   Providers of online_platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the provider of the online_platform to consider that its decision not to act upon the notice is unfounded or that the information to which the complaint relates is not illegal and is not incompatible with its terms_and_conditions, or contains information indicating that the complainant’s conduct does not warrant the measure taken, it shall reverse its decision referred to in paragraph 1 without undue delay.

5.   Providers of online_platforms shall inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates and of the possibility of out-of-court dispute settlement provided for in Article 21 and other available possibilities for redress.

6.   Providers of online_platforms shall ensure that the decisions, referred to in paragraph 5, are taken under the supervision of appropriately qualified staff, and not solely on the basis of automated means.

Article 23

Measures and protection against misuse

1.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal_content.

2.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online_platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient_of_the_service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online_platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal_content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal_content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient_of_the_service, the individual, the entity or the complainant.
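
For illustration only, the quantitative factors in points (a) and (b) could be combined as below, here for the scenario of manifestly unfounded notices under paragraph 2, before the case-by-case assessment this paragraph requires. The thresholds in the TypeScript sketch are invented; the Regulation prescribes no fixed formula, and gravity and intention (points (c) and (d)) must still be weighed.

```typescript
// Illustrative only: combining the absolute number and relative proportion of
// manifestly unfounded notices (Article 23(3), points (a) and (b)). The
// thresholds are invented; a positive result would only trigger the timely,
// diligent and objective case-by-case assessment, not an automatic suspension.

interface MisuseSignals {
  manifestlyUnfoundedNotices: number; // within a given time frame
  totalNoticesSubmitted: number;      // within the same time frame
}

function flagForCaseByCaseAssessment(
  signals: MisuseSignals,
  minAbsolute = 20,     // hypothetical threshold
  minProportion = 0.5   // hypothetical threshold
): boolean {
  const proportion =
    signals.totalNoticesSubmitted === 0
      ? 0
      : signals.manifestlyUnfoundedNotices / signals.totalNoticesSubmitted;
  return signals.manifestlyUnfoundedNotices >= minAbsolute && proportion >= minProportion;
}
```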

4.   Providers of online_platforms shall set out, in a clear and detailed manner, in their terms_and_conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.

Article 24

Transparency reporting obligations for providers of online_platforms

1.   In addition to the information referred to in Article 15, providers of online_platforms shall include in the reports referred to in that Article information on the following:

(a)

the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online_platform implemented the decisions of the body;

(b)

the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal_content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online_platform or online_search_engine, in a publicly available section of their online_interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
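
For illustration only, the six-month averaging described in this paragraph reduces to simple arithmetic. The TypeScript sketch below is not the methodology of the delegated acts referred to in Article 33(3); the monthly figures in the example are invented.

```typescript
// Arithmetic skeleton of the Article 24(2) figure: the average of the monthly
// active recipients of the service in the Union over the past six months.
// How an "active recipient" is counted is a matter for the Article 33(3) methodology.

function averageMonthlyActiveRecipients(lastSixMonths: number[]): number {
  if (lastSixMonths.length !== 6) {
    throw new Error("Expected one monthly figure for each of the past six months");
  }
  const total = lastSixMonths.reduce((sum, monthly) => sum + monthly, 0);
  return Math.round(total / 6);
}

// Example with invented figures; values at or above 45 000 000 are relevant
// for designation under Article 33(1).
const average = averageMonthlyActiveRecipients([
  44_000_000, 46_000_000, 47_000_000, 45_500_000, 44_800_000, 46_200_000,
]);
console.log(average, average >= 45_000_000); // 45583333 true
```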

3.   Providers of online_platforms or of online_search_engines shall communicate to the Digital_Services_Coordinator_of_establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online_platform or of the online_search_engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital_Services_Coordinator_of_establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online_platforms or of online_search_engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online_platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online_platforms shall ensure that the information submitted does not contain personal data.

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

Article 29

Exclusion for micro and small enterprises

1.   This Section shall not apply to providers of online_platforms allowing consumers to conclude distance_contracts with traders that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC.

This Section shall not apply to providers of online_platforms allowing consumers to conclude distance_contracts with traders that previously qualified for the status of a micro or small enterprise as defined in Recommendation 2003/361/EC during the 12 months following their loss of that status pursuant to Article 4(2) thereof, except when they are very large online_platforms in accordance with Article 33.

2.   By derogation from paragraph 1 of this Article, this Section shall apply to providers of online_platforms allowing consumers to conclude distance_contracts with traders that have been designated as very large online_platforms in accordance with Article 33, irrespective of whether they qualify as micro or small enterprises.

Article 32

Right to information

1.   Where a provider of an online_platform allowing consumers to conclude distance_contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider shall inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following:

(a)

the fact that the product or service is illegal;

(b)

the identity of the trader; and

(c)

any relevant means of redress.

The obligation laid down in the first subparagraph shall be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality.

2.   Where, in the situation referred to in paragraph 1, the provider of the online_platform allowing consumers to conclude distance_contracts with traders does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online_interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress.

SECTION 5

Additional obligations for providers of very large online_platforms and of very large online_search_engines to manage systemic risks

Article 33

Very large online_platforms and very large online_search_engines

1.   This Section shall apply to online_platforms and online_search_engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online_platforms or very large online_search_engines pursuant to paragraph 4.

2.   The Commission shall adopt delegated acts in accordance with Article 87 to adjust the number of average monthly active recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases at least by 5 % in relation to its population in 2020 or its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions.
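
For illustration only, the adjustment mechanism in this paragraph amounts to two small calculations: a 5 % change test against the baseline population and a recomputed threshold equal to 10 % of the current Union population, rounded to whole millions. The TypeScript sketch below assumes a 2020 baseline of roughly 447 million.

```typescript
// Worked sketch of the Article 33(2) adjustment rule. Population figures are
// assumptions for the example; the actual adjustment is made by delegated act.

function adjustmentTriggered(currentPopulation: number, baselinePopulation: number): boolean {
  // "increases or decreases at least by 5 %" relative to the 2020 population
  // or the population used for the latest adjustment.
  return Math.abs(currentPopulation - baselinePopulation) / baselinePopulation >= 0.05;
}

function adjustedThreshold(currentPopulation: number): number {
  // 10 % of the Union's population, "rounded up or down to allow the number
  // to be expressed in millions".
  return Math.round((currentPopulation * 0.10) / 1_000_000) * 1_000_000;
}

// With an assumed 2020 population of about 447 million, 10 % rounded to whole
// millions gives the 45 million figure in Article 33(1).
console.log(adjustedThreshold(447_000_000)); // 45000000
```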

3.   The Commission may adopt delegated acts in accordance with Article 87, after consulting the Board, to supplement the provisions of this Regulation by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1 of this Article and Article 24(2), ensuring that the methodology takes account of market and technological developments.

4.   The Commission shall, after having consulted the Member State of establishment or after taking into account the information provided by the Digital_Services_Coordinator_of_establishment pursuant to Article 24(4), adopt a decision designating as a very large online_platform or a very large online_search_engine for the purposes of this Regulation the online_platform or the online_search_engine which has a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1 of this Article. The Commission shall take its decision on the basis of data reported by the provider of the online_platform or of the online_search_engine pursuant to Article 24(2), or information requested pursuant to Article 24(3) or any other information available to the Commission.

The failure by the provider of the online_platform or of the online_search_engine to comply with Article 24(2) or to comply with the request by the Digital_Services_Coordinator_of_establishment or by the Commission pursuant to Article 24(3) shall not prevent the Commission from designating that provider as a provider of a very large online_platform or of a very large online_search_engine pursuant to this paragraph.

Where the Commission bases its decision on other information available to the Commission pursuant to the first subparagraph of this paragraph or on the basis of additional information requested pursuant to Article 24(3), the Commission shall give the provider of the online_platform or of the online_search_engine concerned 10 working days in which to submit its views on the Commission’s preliminary findings and on its intention to designate the online_platform or the online_search_engine as a very large online_platform or as a very large online_search_engine, respectively. The Commission shall take due account of the views submitted by the provider concerned.

The failure of the provider of the online_platform or of the online_search_engine concerned to submit its views pursuant to the third subparagraph shall not prevent the Commission from designating that online_platform or that online_search_engine as a very large online_platform or as a very large online_search_engine, respectively, based on other information available to it.

5.   The Commission shall terminate the designation if, during an uninterrupted period of one year, the online_platform or the online_search_engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in paragraph 1.

6.   The Commission shall notify its decisions pursuant to paragraphs 4 and 5, without undue delay, to the provider of the online_platform or of the online_search_engine concerned, to the Board and to the Digital_Services_Coordinator_of_establishment.

The Commission shall ensure that the list of designated very large online_platforms and very large online_search_engines is published in the Official Journal of the European Union, and shall keep that list up to date. The obligations set out in this Section shall apply, or cease to apply, to the very large online_platforms and very large online_search_engines concerned from four months after the notification to the provider concerned referred to in the first subparagraph.

Article 34

Risk assessment

1.   Providers of very large online_platforms and of very large online_search_engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.

They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

(a)

the dissemination of illegal_content through their services;

(b)

any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;

(c)

any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

(d)

any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

2.   When conducting risk assessments, providers of very large online_platforms and of very large online_search_engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:

(a)

the design of their recommender_systems and any other relevant algorithmic system;

(b)

their content_moderation systems;

(c)

the applicable terms_and_conditions and their enforcement;

(d)

systems for selecting and presenting advertisements;

(e)

data related practices of the provider.

The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal_content and of information that is incompatible with their terms_and_conditions.

The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

3.   Providers of very large online_platforms and of very large online_search_engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital_Services_Coordinator_of_establishment.

Article 35

Mitigation of risks

1.   Providers of very large online_platforms and of very large online_search_engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online_interfaces;

(b)

adapting their terms_and_conditions and their enforcement;

(c)

adapting content_moderation processes, including the speed and quality of processing notices related to specific types of illegal_content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content_moderation;

(d)

testing and adapting their algorithmic systems, including their recommender_systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online_platforms or of online_search_engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online_interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online_interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online_platforms and of very large online_search_engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online_platforms and of very large online_search_engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 36

Crisis response mechanism

1.   Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision, requiring one or more providers of very large online_platforms or of very large online_search_engines to take one or more of the following actions:

(a)

assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;

(b)

identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

(c)

report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.

When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

2.   For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

3.   When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:

(a)

the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;

(b)

the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;

(c)

the actions required by the decision are limited to a period not exceeding three months.

4.   After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:

(a)

notify the decision to the provider or providers to which the decision is addressed;

(b)

make the decision publicly available; and

(c)

inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.

5.   The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.

6.   The Commission may on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).

7.   The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.

Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.

8.   Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:

(a)

revoking the decision and, where appropriate, requiring the very large online_platform or very large online_search_engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;

(b)

extending the period referred to in paragraph 3, point (c), by a period of no more than three months;

(c)

taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

9.   The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.

10.   The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.

11.   The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.

Article 37

Independent audit

1.   Providers of very large online_platforms and of very large online_search_engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online_platforms and of very large online_search_engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online_platforms and of very large online_search_engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online_platforms or of very large online_search_engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online_platform or of very large online_search_engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online_platform or of very large online_search_engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.

4.   Providers of very large online_platforms and of very large online_search_engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online_platform or of the very large online_search_engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online_platform or of the very large online_search_engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
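
For illustration only, the minimum content of the audit report under this paragraph, together with the explanation required by paragraph 5, can be read as one structured record. The TypeScript sketch below uses invented field names; reporting templates may be laid down by the delegated acts referred to in paragraph 7.

```typescript
// Hypothetical record mirroring the minimum content of an Article 37(4) audit
// report. All identifiers are invented for illustration.

type AuditOpinion = "positive" | "positive with comments" | "negative";

interface AuditReport {
  auditedProvider: { name: string; address: string; pointOfContact: string }; // point (a)
  periodCovered: { from: string; to: string };                                // point (a)
  auditingOrganisations: Array<{ name: string; address: string }>;            // point (b)
  declarationOfInterests: string;                                             // point (c)
  elementsAudited: string;                                                    // point (d)
  methodology: string;                                                        // point (d)
  mainFindings: string;                                                       // point (e)
  thirdPartiesConsulted: string[];                                            // point (f)
  opinion: AuditOpinion;                                                      // point (g)
  // Point (h): required where the opinion is not "positive".
  operationalRecommendations?: Array<{ measure: string; recommendedTimeframe: string }>;
  // Paragraph 5: elements that could not be audited, with reasons.
  unauditedElements?: Array<{ element: string; reason: string }>;
}
```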

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online_platforms or of very large online_search_engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).

Article 40

Data access and scrutiny

1.   Providers of very large online_platforms or of very large online_search_engines shall provide the Digital_Services_Coordinator_of_establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation.

2.   Digital Services Coordinators and the Commission shall use the data accessed pursuant to paragraph 1 only for the purpose of monitoring and assessing compliance with this Regulation and shall take due account of the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of personal data, the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

3.   For the purposes of paragraph 1, providers of very large online_platforms or of very large online_search_engines shall, at the request of either the Digital_Services_Coordinator_of_establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender_systems.

4.   Upon a reasoned request from the Digital_Services_Coordinator_of_establishment, providers of very large online_platforms or of very large online_search_engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.

5.   Within 15 days following receipt of a request as referred to in paragraph 4, providers of very large online_platforms or of very large online_search_engines may request the Digital_Services_Coordinator_of_establishment to amend the request, where they consider that they are unable to give access to the data requested for one of the following two reasons:

(a)

they do not have access to the data;

(b)

giving access to the data will lead to significant vulnerabilities in the security of their service or the protection of confidential information, in particular trade secrets.

6.   Requests for amendment pursuant to paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.

The Digital_Services_Coordinator_of_establishment shall decide on the request for amendment within 15 days and communicate to the provider of the very large online_platform or of the very large online_search_engine its decision and, where relevant, the amended request and the new period to comply with the request.

7.   Providers of very large online_platforms or of very large online_search_engines shall facilitate and provide access to data pursuant to paragraphs 1 and 4 through appropriate interfaces specified in the request, including online databases or application programming interfaces.
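
By way of illustration only: paragraph 7 leaves the concrete interface to be specified in each request, and the Regulation does not prescribe any particular API or data format. The sketch below shows how a provider-side intake step for such requests might be modelled; every field name and check is an assumption made for this example, not anything taken from the Regulation.

```python
# Illustrative only: the Regulation does not define a request schema or an API.
# Every field and check below is a hypothetical assumption for this sketch.
from dataclasses import dataclass
from datetime import date

@dataclass
class DataAccessRequest:
    requesting_authority: str    # e.g. a Digital Services Coordinator or the Commission
    legal_basis: str             # e.g. "Article 40(1)" or "Article 40(4)"
    purpose: str                 # compliance monitoring or vetted-researcher research
    data_categories: list[str]   # the data the reasoned request asks for
    interface: str               # "api" or "online_database", as specified in the request
    deadline: date               # the "reasonable period specified in that request"

def is_well_formed(req: DataAccessRequest, today: date) -> bool:
    """Minimal plausibility checks an intake system might run before routing
    the request to the provider's legal and engineering teams."""
    return (
        bool(req.requesting_authority)
        and req.legal_basis.startswith("Article 40")
        and bool(req.data_categories)
        and req.interface in {"api", "online_database"}
        and req.deadline > today
    )

# Hypothetical example:
req = DataAccessRequest(
    requesting_authority="Digital Services Coordinator of establishment",
    legal_basis="Article 40(4)",
    purpose="research on systemic risks under Article 34(1)",
    data_categories=["recommender system interaction logs"],
    interface="api",
    deadline=date(2025, 6, 30),
)
print(is_well_formed(req, today=date(2025, 1, 15)))  # True
```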

8.   Upon a duly substantiated application from researchers, the Digital_Services_Coordinator_of_establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of very large online_platform or of very large online_search_engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:

(a)

they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;

(b)

they are independent from commercial interests;

(c)

their application discloses the funding of the research;

(d)

they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and of protecting personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;

(e)

their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in paragraph 4;

(f)

the planned research activities will be carried out for the purposes laid down in paragraph 4;

(g)

they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.

Upon receipt of the application pursuant to this paragraph, the Digital_Services_Coordinator_of_establishment shall inform the Commission and the Board.

9.   Researchers may also submit their application to the Digital Services Coordinator of the Member State of the research organisation to which they are affiliated. Upon receipt of the application pursuant to this paragraph the Digital Services Coordinator shall conduct an initial assessment as to whether the respective researchers meet all of the conditions set out in paragraph 8. The respective Digital Services Coordinator shall subsequently send the application, together with the supporting documents submitted by the respective researchers and the initial assessment, to the Digital_Services_Coordinator_of_establishment. The Digital_Services_Coordinator_of_establishment shall take a decision whether to award a researcher the status of ‘vetted researcher’ without undue delay.

While taking due account of the initial assessment provided, the final decision to award a researcher the status of ‘vetted researcher’ lies within the competence of the Digital_Services_Coordinator_of_establishment, pursuant to paragraph 8.

10.   The Digital Services Coordinator that awarded the status of vetted researcher and issued the reasoned request for data access to the providers of very large online_platforms or of very large online_search_engines in favour of a vetted researcher shall issue a decision terminating the access if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, that the vetted researcher no longer meets the conditions set out in paragraph 8, and shall inform the provider of the very large online_platform or of the very large online_search_engine concerned of the decision. Before terminating the access, the Digital Services Coordinator shall allow the vetted researcher to react to the findings of its investigation and to its intention to terminate the access.

11.   Digital Services Coordinators of establishment shall communicate to the Board the names and contact information of the natural persons or entities to which they have awarded the status of ‘vetted researcher’ in accordance with paragraph 8, as well as the purpose of the research in respect of which the application was made and, where they have terminated the access to the data in accordance with paragraph 10, communicate that information to the Board.

12.   Providers of very large online_platforms or of very large online_search_engines shall give access without undue delay to data, including, where technically possible, to real-time data, provided that the data is publicly accessible in their online_interface, to researchers, including those affiliated to not for profit bodies, organisations and associations, who comply with the conditions set out in paragraph 8, points (b), (c), (d) and (e), and who use the data solely for performing research that contributes to the detection, identification and understanding of systemic risks in the Union pursuant to Article 34(1).

13.   The Commission shall, after consulting the Board, adopt delegated acts supplementing this Regulation by laying down the technical conditions under which providers of very large online_platforms or of very large online_search_engines are to share data pursuant to paragraphs 1 and 4 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with researchers can take place in compliance with Regulation (EU) 2016/679, as well as relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data, taking into account the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

Article 42

Transparency reporting obligations

1.   Providers of very large online_platforms or of very large online_search_engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.
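
Purely as an illustration, the deadline rule in paragraph 1 (a first report at the latest two months after the date of application, then at least every six months) amounts to simple calendar arithmetic. The sketch below computes a compliant publication schedule for a hypothetical date of application; the input date is an assumption for the example and not a date taken from the Regulation.

```python
# Illustrative only: latest compliant publication dates implied by Article 42(1)
# for an assumed date of application under Article 33(6).
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the last day of the target month."""
    y, m0 = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m0 + 1
    last_day = calendar.monthrange(year, month)[1]
    return date(year, month, min(d.day, last_day))

def publication_deadlines(date_of_application: date, reports: int = 3) -> list[date]:
    """First report at the latest two months after the date of application,
    then at least every six months."""
    deadlines = [add_months(date_of_application, 2)]
    for _ in range(reports - 1):
        deadlines.append(add_months(deadlines[-1], 6))
    return deadlines

# Hypothetical date of application:
print(publication_deadlines(date(2023, 8, 25)))
# [datetime.date(2023, 10, 25), datetime.date(2024, 4, 25), datetime.date(2024, 10, 25)]
```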

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online_platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a)

the human resources that the provider of very large online_platforms dedicates to content_moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b)

the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c)

the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.

3.   In addition to the information referred to in Article 24(2), the providers of very large online_platforms or of very large online_search_engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.

4.   Providers of very large online_platforms or of very large online_search_engines shall transmit to the Digital_Services_Coordinator_of_establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a)

a report setting out the results of the risk assessment pursuant to Article 34;

(b)

the specific mitigation measures put in place pursuant to Article 35(1);

(c)

the audit report provided for in Article 37(4);

(d)

the audit implementation report provided for in Article 37(6);

(e)

where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of very large online_platform or of very large online_search_engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital_Services_Coordinator_of_establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.
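
As a purely illustrative sketch of the mechanics of paragraph 5, the snippet below derives a public version of a report by dropping sections flagged as confidential and records a statement of reasons to accompany the complete version transmitted to the Digital_Services_Coordinator_of_establishment and the Commission. The report structure and the ‘confidential’ flag are assumptions made for the example, not anything the Regulation prescribes.

```python
# Illustrative only: one way a provider could mechanically derive the public version
# of the reports listed in Article 42(4) from the complete version, keeping the
# complete version and a statement of reasons for the authorities (Article 42(5)).
# Field names and the "confidential" flag are assumptions for this sketch.
from copy import deepcopy

def split_for_publication(complete_report: dict) -> tuple[dict, dict, list[str]]:
    """Returns (public_report, complete_report, reasons). Sections marked
    confidential are dropped from the public version and a reason is recorded."""
    public = deepcopy(complete_report)
    reasons = []
    for section, content in complete_report["sections"].items():
        if content.get("confidential"):
            del public["sections"][section]
            reasons.append(f"{section}: {content.get('reason', 'confidential information')}")
    return public, complete_report, reasons

# Invented example:
report = {
    "sections": {
        "risk_assessment_results": {"confidential": False, "body": "..."},
        "mitigation_measures": {
            "confidential": True,
            "reason": "would cause significant vulnerabilities for the security of the service",
            "body": "...",
        },
    }
}
public, complete, reasons = split_for_publication(report)
print(sorted(public["sections"]))  # ['risk_assessment_results']
print(reasons)
```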

Article 43

Supervisory fee

1.   The Commission shall charge providers of very large online_platforms and of very large online_search_engines an annual supervisory fee upon their designation pursuant to Article 33.

2.   The overall amount of the annual supervisory fees shall cover the estimated costs that the Commission incurs in relation to its supervisory tasks under this Regulation, in particular costs related to the designation pursuant to Article 33, to the set-up, maintenance and operation of the database pursuant to Article 24(5) and to the information sharing system pursuant to Article 85, to referrals pursuant to Article 59, to supporting the Board pursuant to Article 62 and to the supervisory tasks pursuant to Article 56 and Section 4 of Chapter IV.

3.   The providers of very large online_platforms and of very large online_search_engines shall be charged annually a supervisory fee for each service for which they have been designated pursuant to Article 33.

The Commission shall adopt implementing acts establishing the amount of the annual supervisory fee in respect of each provider of very large online_platform or of very large online_search_engine. When adopting those implementing acts, the Commission shall apply the methodology laid down in the delegated act referred to in paragraph 4 of this Article and shall respect the principles set out in paragraph 5 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

4.   The Commission shall adopt delegated acts, in accordance with Article 87, laying down the detailed methodology and procedures for:

(a)

the determination of the estimated costs referred to in paragraph 2;

(b)

the determination of the individual annual supervisory fees referred to in paragraph 5, points (b) and (c);

(c)

the determination of the maximum overall limit defined in paragraph 5, point (c); and

(d)

the detailed arrangements necessary to make payments.

When adopting those delegated acts, the Commission shall respect the principles set out in paragraph 5 of this Article.

5.   The implementing act referred to in paragraph 3 and the delegated act referred to in paragraph 4 shall respect the following principles:

(a)

the estimation of the overall amount of the annual supervisory fee takes into account the costs incurred in the previous year;

(b)

the annual supervisory fee is proportionate to the number of average monthly active recipients in the Union of each very large online_platform or each very large online_search_engine designated pursuant to Article 33;

(c)

the overall amount of the annual supervisory fee charged on a given provider of very large online_platform or of very large online_search_engine does not, in any case, exceed 0,05 % of its worldwide annual net income in the preceding financial year.
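
A purely illustrative arithmetic sketch of how points (a) to (c) above could interact: the total estimated cost is shared pro rata to average monthly active recipients, and each provider's share is then capped at 0,05 % of its worldwide annual net income. The actual methodology is the one laid down in the delegated act referred to in paragraph 4; all names and figures below are invented.

```python
# Illustrative only: a simplified combination of the principles in Article 43(5).
def supervisory_fees(total_cost: float, providers: dict[str, tuple[int, float]]) -> dict[str, float]:
    """providers maps a name to (average monthly active recipients in the Union,
    worldwide annual net income in EUR). Returns the fee per provider in EUR."""
    total_recipients = sum(recipients for recipients, _ in providers.values())
    fees = {}
    for name, (recipients, net_income) in providers.items():
        pro_rata_share = total_cost * recipients / total_recipients   # point (b)
        cap = 0.0005 * net_income                                     # point (c): 0,05 %
        fees[name] = min(pro_rata_share, cap)
    return fees

# Invented example: EUR 45 million of estimated costs, three designated services.
print(supervisory_fees(
    45_000_000,
    {
        "Platform A": (200_000_000, 80_000_000_000),
        "Platform B": (120_000_000, 10_000_000_000),   # cap of EUR 5 million applies here
        "Search C": (100_000_000, 50_000_000_000),
    },
))
```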

6.   The individual annual supervisory fees charged pursuant to paragraph 1 of this Article shall constitute external assigned revenue in accordance with Article 21(5) of Regulation (EU, Euratom) 2018/1046 of the European Parliament and of the Council (41).

7.   The Commission shall report annually to the European Parliament and to the Council on the overall amount of the costs incurred for the fulfilment of the tasks under this Regulation and the total amount of the individual annual supervisory fees charged in the preceding year.

SECTION 6

Other provisions concerning due diligence obligations

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms_and_conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online_platforms and of very large online_search_engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial_communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender_systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.

Article 45

Codes of conduct

1.   The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal_content and systemic risks, in accordance with Union law in particular on competition and the protection of personal data.

2.   Where significant systemic risks within the meaning of Article 34(1) emerge and concern several very large online_platforms or very large online_search_engines, the Commission may invite the providers of very large online_platforms concerned or the providers of very large online_search_engines concerned, and other providers of very large online_platforms, of very large online_search_engines, of online_platforms and of other intermediary_services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

3.   When giving effect to paragraphs 1 and 2, the Commission and the Board, and where relevant other bodies, shall aim to ensure that the codes of conduct clearly set out their specific objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.

4.   The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives, having regard to the key performance indicators that they might contain. They shall publish their conclusions.

The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct.

In the case of systematic failure to comply with the codes of conduct, the Commission and the Board may invite the signatories to the codes of conduct to take the necessary action.

Article 46

Codes of conduct for online advertising

1.   The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level by providers of online_platforms and other relevant service providers, such as providers of online advertising intermediary_services, other actors involved in the programmatic advertising value chain, or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for actors in the online advertising value chain beyond the requirements of Articles 26 and 39.

2.   The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information that fully respects the rights and interests of all parties involved, as well as a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct at least address the following:

(a)

the transmission of information held by providers of online advertising intermediaries to recipients of the service concerning the requirements set in Article 26(1), points (b), (c) and (d);

(b)

the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 39;

(c)

meaningful information on data monetisation.

3.   The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.

4.   The Commission shall encourage all the actors in the online advertising value chain referred to in paragraph 1 to endorse the commitments stated in the codes of conduct, and to comply with them.

Article 49

Competent authorities and Digital Services Coordinators

1.   Member States shall designate one or more competent authorities to be responsible for the supervision of providers of intermediary_services and enforcement of this Regulation (‘competent authorities’).

2.   Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to supervision and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent supervision and enforcement of this Regulation throughout the Union.

For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for cooperation mechanisms and regular exchanges of views between the Digital Services Coordinator and other national authorities where relevant for the performance of their respective tasks.

Where a Member State designates one or more competent authorities in addition to the Digital Services Coordinator, it shall ensure that the respective tasks of those authorities and of the Digital Services Coordinator are clearly defined and that they cooperate closely and effectively when performing their tasks.

3.   Member States shall designate the Digital Services Coordinators by 17 February 2024.

Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted. The Member State concerned shall communicate to the Commission and the Board the name of the other competent authorities referred to in paragraph 2, as well as their respective tasks.

4.   The provisions applicable to Digital Services Coordinators set out in Articles 50, 51 and 56 shall also apply to any other competent authorities that the Member States designate pursuant to paragraph 1 of this Article.

Article 51

Powers of Digital Services Coordinators

1.   Where needed in order to carry out their tasks under this Regulation, Digital Services Coordinators shall have the following powers of investigation, in respect of conduct by providers of intermediary_services falling within the competence of their Member State:

(a)

the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Article 37 and Article 75(2), to provide such information without undue delay;

(b)

the power to carry out, or to request a judicial authority in their Member State to order, inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium;

(c)

the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers with their consent by any technical means.

2.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall have the following enforcement powers, in respect of providers of intermediary_services falling within the competence of their Member State:

(a)

the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding;

(b)

the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end, or to request a judicial authority in their Member State to do so;

(c)

the power to impose fines, or to request a judicial authority in their Member State to do so, in accordance with Article 52 for failure to comply with this Regulation, including with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(d)

the power to impose a periodic penalty payment, or to request a judicial authority in their Member State to do so, in accordance with Article 52 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this subparagraph or for failure to comply with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(e)

the power to adopt interim measures or to request the competent national judicial authority in their Member State to do so, to avoid the risk of serious harm.

As regards the first subparagraph, points (c) and (d), Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after providing those other persons in good time with all relevant information relating to such orders, including the applicable period, the fines or periodic penalty payments that may be imposed for failure to comply and the possibilities for redress.

3.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall, in respect of providers of intermediary_services falling within the competence of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted and the infringement has not been remedied or is continuing and is causing serious harm which cannot be avoided through the exercise of other powers available under Union or national law, also have the power to take the following measures:

(a)

to require the management body of those providers, without undue delay, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;

(b)

where the Digital Services Coordinator considers that a provider of intermediary_services has not sufficiently complied with the requirements referred to in point (a), that the infringement has not been remedied or is continuing and is causing serious harm, and that that infringement entails a criminal offence involving a threat to the life or safety of persons, to request that the competent judicial authority of its Member State order the temporary restriction of access of recipients to the service concerned by the infringement or, only where that is not technically feasible, to the online_interface of the provider of intermediary_services on which the infringement takes place.

The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 82, prior to submitting the request referred to in the first subparagraph, point (b), of this paragraph invite interested parties to submit written observations within a period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider of intermediary_services, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.

The restriction of access shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same length, subject to a maximum number of extensions set by that judicial authority. The Digital Services Coordinator shall only extend the period where, having regard to the rights and interests of all parties affected by that restriction and all relevant circumstances, including any information that the provider of intermediary_services, the addressee or addressees and any other third party that demonstrated a legitimate interest may provide to it, it considers that both of the following conditions have been met:

(a)

the provider of intermediary_services has failed to take the necessary measures to terminate the infringement;

(b)

the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist.

Where the Digital Services Coordinator considers that the conditions set out in the third subparagraph, points (a) and (b), have been met but it cannot further extend the period pursuant to the third subparagraph, it shall submit a new request to the competent judicial authority, as referred to in the first subparagraph, point (b).

4.   The powers listed in paragraphs 1, 2 and 3 shall be without prejudice to Section 3.

5.   The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary_services concerned where relevant.

6.   Member States shall lay down specific rules and procedures for the exercise of the powers pursuant to paragraphs 1, 2 and 3 and shall ensure that any exercise of those powers is subject to adequate safeguards laid down in the applicable national law in compliance with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties.

Article 54

Compensation

Recipients of the service shall have the right to seek, in accordance with Union and national law, compensation from providers of intermediary_services, in respect of any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.

Article 56

Competences

1.   The Member State in which the main establishment of the provider of intermediary_services is located shall have exclusive powers to supervise and enforce this Regulation, except for the powers provided for in paragraphs 2, 3 and 4.

2.   The Commission shall have exclusive powers to supervise and enforce Section 5 of Chapter III.

3.   The Commission shall have powers to supervise and enforce this Regulation, other than those laid down in Section 5 of Chapter III thereof, against providers of very large online_platforms and of very large online_search_engines.

4.   Where the Commission has not initiated proceedings for the same infringement, the Member State in which the main establishment of the provider of very large online_platform or of very large online_search_engine is located shall have powers to supervise and enforce the obligations under this Regulation, other than those laid down in Section 5 of Chapter III, with respect to those providers.

5.   Member States and the Commission shall supervise and enforce the provisions of this Regulation in close cooperation.

6.   Where a provider of intermediary_services does not have an establishment in the Union, the Member State where its legal representative resides or is established or the Commission shall have powers, as applicable, in accordance with paragraphs 1 and 4 of this Article, to supervise and enforce the relevant obligations under this Regulation.

7.   Where a provider of intermediary_services fails to appoint a legal representative in accordance with Article 13, all Member States and, in the case of a provider of a very large online_platform or of a very large online_search_engine, the Commission shall have powers to supervise and enforce in accordance with this Article.

Where a Digital Services Coordinator intends to exercise its powers under this paragraph, it shall notify all other Digital Services Coordinators and the Commission, and ensure that the applicable safeguards afforded by the Charter are respected, in particular to avoid that the same conduct is sanctioned more than once for the infringement of the obligations laid down in this Regulation. Where the Commission intends to exercise its powers under this paragraph, it shall notify all other Digital Services Coordinators of that intention. Following the notification pursuant to this paragraph, other Member States shall not initiate proceedings for the same infringement as that referred to in the notification.

Article 57

Mutual assistance

1.   Digital Services Coordinators and the Commission shall cooperate closely and provide each other with mutual assistance in order to apply this Regulation in a consistent and efficient manner. Mutual assistance shall include, in particular, exchange of information in accordance with this Article and the duty of the Digital_Services_Coordinator_of_establishment to inform all Digital Services Coordinators of destination, the Board and the Commission about the opening of an investigation and the intention to take a final decision, including its assessment, in respect of a specific provider of intermediary_services.

2.   For the purpose of an investigation, the Digital_Services_Coordinator_of_establishment may request other Digital Services Coordinators to provide specific information in their possession as regards a specific provider of intermediary_services or to exercise their investigative powers referred to in Article 51(1) with regard to specific information located in their Member State. Where appropriate, the Digital Services Coordinator receiving the request may involve other competent authorities or other public authorities of the Member State in question.

3.   The Digital Services Coordinator receiving the request pursuant to paragraph 2 shall comply with such request and inform the Digital_Services_Coordinator_of_establishment about the action taken, without undue delay and no later than two months after its receipt, unless:

(a)

the scope or the subject matter of the request is not sufficiently specified, justified or proportionate in view of the investigative purposes; or

(b)

neither the requested Digital Services Coordinator nor any other competent authority or other public authority of that Member State is in possession of the requested information or can have access to it; or

(c)

the request cannot be complied with without infringing Union or national law.

The Digital Services Coordinator receiving the request shall justify its refusal by submitting a reasoned reply, within the period set out in the first subparagraph.

Article 60

Joint investigations

1.   The Digital_Services_Coordinator_of_establishment may launch and lead joint investigations with the participation of one or more other Digital Services Coordinators concerned:

(a)

at its own initiative, to investigate an alleged infringement of this Regulation by a given provider of intermediary_services in several Member States; or

(b)

upon recommendation of the Board, acting on the request of at least three Digital Services Coordinators alleging, based on a reasonable suspicion, an infringement by a given provider of intermediary_services affecting recipients of the service in their Member States.

2.   Any Digital Services Coordinator that proves that it has a legitimate interest in participating in a joint investigation pursuant to paragraph 1 may request to do so. The joint investigation shall be concluded within three months from its launch, unless otherwise agreed amongst the participants.

The Digital_Services_Coordinator_of_establishment shall communicate its preliminary position on the alleged infringement no later than one month after the expiry of the deadline referred to in the first subparagraph to all Digital Services Coordinators, the Commission and the Board. The preliminary position shall take into account the views of all other Digital Services Coordinators participating in the joint investigation. Where applicable, this preliminary position shall also set out the enforcement measures envisaged.

3.   The Board may refer the matter to the Commission pursuant to Article 59, where:

(a)

the Digital_Services_Coordinator_of_establishment failed to communicate its preliminary position within the deadline set out in paragraph 2;

(b)

the Board substantially disagrees with the preliminary position communicated by the Digital_Services_Coordinator_of_establishment; or

(c)

the Digital_Services_Coordinator_of_establishment failed to initiate the joint investigation promptly following the recommendation by the Board pursuant to paragraph 1, point (b).

4.   In carrying out the joint investigation, the participating Digital Services Coordinators shall cooperate in good faith, taking into account, where applicable, the indications of the Digital_Services_Coordinator_of_establishment and the Board’s recommendation. The Digital Services Coordinators of destination participating in the joint investigation shall be entitled, at the request of or after having consulted the Digital_Services_Coordinator_of_establishment, to exercise their investigative powers referred to in Article 51(1) in respect of the providers of intermediary_services concerned by the alleged infringement, with regard to information and premises located within their territory.

SECTION 3

European Board for Digital Services

Article 63

Tasks of the Board

1.   Where necessary to meet the objectives set out in Article 61(2), the Board shall in particular:

(a)

support the coordination of joint investigations;

(b)

support the competent authorities in the analysis of reports and results of audits of very large online_platforms or of very large online_search_engines to be transmitted pursuant to this Regulation;

(c)

issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation, taking into account, in particular, the freedom to provide services of the providers of intermediary_services;

(d)

advise the Commission on the measures referred to in Article 66 and adopt opinions concerning very large online_platforms or very large online_search_engines in accordance with this Regulation;

(e)

support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in cooperation with relevant stakeholders as provided for in this Regulation, including by issuing opinions or recommendations on matters related to Article 44, as well as the identification of emerging issues, with regard to matters covered by this Regulation.

2.   Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.

SECTION 4

Supervision, investigation, enforcement and monitoring in respect of providers of very large online_platforms and of very large online_search_engines

Article 64

Development of expertise and capabilities

1.   The Commission, in cooperation with the Digital Services Coordinators and the Board, shall develop Union expertise and capabilities, including, where appropriate, through the secondment of Member States’ personnel.

2.   In addition, the Commission, in cooperation with the Digital Services Coordinators and the Board, shall coordinate the assessment of systemic and emerging issues across the Union in relation to very large online_platforms or very large online_search_engines with regard to matters covered by this Regulation.

3.   The Commission may ask the Digital Services Coordinators, the Board and other Union bodies, offices and agencies with relevant expertise to support the assessment of systemic and emerging issues across the Union under this Regulation.

4.   Member States shall cooperate with the Commission, in particular through their respective Digital Services Coordinators and other competent authorities, where applicable, including by making available their expertise and capabilities.

Article 66

Initiation of proceedings by the Commission and cooperation in investigation

1.   The Commission may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 73 and 74 in respect of the relevant conduct by the provider of the very large online_platform or of the very large online_search_engine that the Commission suspects of having infringed any of the provisions of this Regulation.

2.   Where the Commission decides to initiate proceedings pursuant to paragraph 1 of this Article, it shall notify all Digital Services Coordinators and the Board through the information sharing system referred to in Article 85, as well as the provider of the very large online_platform or of the very large online_search_engine concerned.

The Digital Services Coordinators shall, without undue delay after being informed of initiation of the proceedings, transmit to the Commission any information they hold about the infringement at stake.

The initiation of proceedings pursuant to paragraph 1 of this Article by the Commission shall relieve the Digital Services Coordinator, or any competent authority where applicable, of its powers to supervise and enforce provided for in this Regulation pursuant to Article 56(4).

3.   In the exercise of its powers of investigation under this Regulation the Commission may request the individual or joint support of any Digital Services Coordinators concerned by the suspected infringement, including the Digital_Services_Coordinator_of_establishment. The Digital Services Coordinators that have received such a request, and, where involved by the Digital Services Coordinator, any other competent authority, shall cooperate sincerely and in a timely manner with the Commission and shall be entitled to exercise their investigative powers referred to in Article 51(1) in respect of the provider of the very large online_platform or of the very large online_search_engine at stake, with regard to information, persons and premises located within the territory of their Member State and in accordance with the request.

4.   The Commission shall provide the Digital_Services_Coordinator_of_establishment and the Board with all relevant information about the exercise of the powers referred to in Articles 67 to 72 and its preliminary findings referred to in Article 79(1). The Board shall submit its views on those preliminary findings to the Commission within the period set pursuant to Article 79(2). The Commission shall take utmost account of any views of the Board in its decision.

Article 69

Power to conduct inspections

1.   In order to carry out the tasks assigned to it under this Section, the Commission may conduct all necessary inspections at the premises of the provider of the very large online_platform or of the very large online_search_engine concerned or of another person referred to in Article 67(1).

2.   The officials and other accompanying persons authorised by the Commission to conduct an inspection shall be empowered to:

(a)

enter any premises, land and means of transport of the provider of the very large online_platform or of the very large online_search_engine concerned or of the other person concerned;

(b)

examine the books and other records related to the provision of the service concerned, irrespective of the medium on which they are stored;

(c)

take or obtain in any form copies of or extracts from such books or other records;

(d)

require the provider of the very large online_platform or of the very large online_search_engine or the other person concerned to provide access to and explanations on its organisation, functioning, IT system, algorithms, data-handling and business practices and to record or document the explanations given;

(e)

seal any premises used for purposes related to the trade, business, craft or profession of the provider of the very large online_platform or of the very large online_search_engine or of the other person concerned, as well as books or other records, for the period and to the extent necessary for the inspection;

(f)

ask any representative or member of staff of the provider of the very large online_platform or of the very large online_search_engine or the other person concerned for explanations on facts or documents relating to the subject-matter and purpose of the inspection and to record the answers;

(g)

address questions to any such representative or member of staff relating to the subject-matter and purpose of the inspection and to record the answers.

3.   Inspections may be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 72(2), and of the Digital Services Coordinator or other competent national authorities of the Member State in the territory of which the inspection is conducted.

4.   The officials and other accompanying persons authorised by the Commission to conduct an inspection shall exercise their powers upon production of a written authorisation specifying the subject matter and purpose of the inspection and the penalties provided for in Articles 74 and 76 applicable in the case where the production of the required books or other records related to the provision of the service concerned is incomplete or where the answers to questions asked under paragraph 2 of this Article are incorrect, incomplete or misleading. In good time before the inspection, the Commission shall inform the Digital Services Coordinator of the Member State in the territory of which the inspection is to be conducted of the inspection.

5.   During inspections, the officials and other accompanying persons authorised by the Commission, the auditors and experts appointed by the Commission, the Digital Services Coordinator or the other competent authorities of the Member State in the territory of which the inspection is conducted may require the provider of the very large online_platform or of the very large online_search_engine or other person concerned to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conducts, and may address questions to its key personnel.

6.   The provider of the very large online_platform or of the very large online_search_engine or other natural or legal person concerned shall be required to submit to an inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the inspection, set the date on which it is to begin and indicate the penalties provided for in Articles 74 and 76 and the right to have the decision reviewed by the Court of Justice of the European Union. The Commission shall consult the Digital Services Coordinator of the Member State on the territory of which the inspection is to be conducted prior to taking that decision.

7.   Officials of, and other persons authorised or appointed by, the Digital Services Coordinator of the Member State on the territory of which the inspection is to be conducted shall, at the request of that Digital Services Coordinator or of the Commission, actively assist the officials and other accompanying persons authorised by the Commission in relation to the inspection. To this end, they shall have the powers listed in paragraph 2.

8.   Where the officials and other accompanying persons authorised by the Commission find that the provider of the very large online_platform or of the very large online_search_engine or the other person concerned opposes an inspection ordered pursuant to this Article, the Member State in the territory of which the inspection is to be conducted shall, at the request of those officials or other accompanying persons and in accordance with the national law of the Member State, afford them necessary assistance, including, where appropriate under that national law, in the form of coercive measures taken by a competent law enforcement authority, so as to enable them to conduct the inspection.

9.   If the assistance provided for in paragraph 8 requires authorisation from a national judicial authority in accordance with the national law of the Member State concerned, such authorisation shall be applied for by the Digital Services Coordinator of that Member State at the request of the officials and other accompanying persons authorised by the Commission. Such authorisation may also be applied for as a precautionary measure.

10.   Where the authorisation referred to in paragraph 9 is applied for, the national judicial authority before which a case has been brought shall verify that the Commission decision ordering the inspection is authentic and that the coercive measures envisaged are neither arbitrary nor excessive having regard to the subject matter of the inspection. When conducting such verification, the national judicial authority may ask the Commission, directly or through the Digital Services Coordinators of the Member State concerned, for detailed explanations, in particular those concerning the grounds on which the Commission suspects an infringement of this Regulation, concerning the seriousness of the suspected infringement and concerning the nature of the involvement of the provider of the very large online_platform or of the very large online_search_engine or of the other person concerned. However, the national judicial authority shall not call into question the necessity for the inspection nor demand information from the case file of the Commission. The lawfulness of the Commission decision shall be subject to review only by the Court of Justice of the European Union.

Article 77

Limitation period for the imposition of penalties

1.   The powers conferred on the Commission by Articles 74 and 76 shall be subject to a limitation period of five years.

2.   Time shall begin to run on the day on which the infringement is committed. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases.

3.   Any action taken by the Commission or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following:

(a)

requests for information by the Commission or by a Digital Services Coordinator;

(b)

inspection;

(c)

the opening of a proceeding by the Commission pursuant to Article 66(1).

4.   Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period has been suspended pursuant to paragraph 5.

5.   The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Commission is the subject of proceedings pending before the Court of Justice of the European Union.
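
Purely as an illustration of the interplay of paragraphs 2 to 5, the sketch below computes the last day on which a penalty could still be imposed under a simplified, whole-day reading of the limitation rules. The helper and the dates used are assumptions for the example, not an interpretation endorsed by the Regulation.

```python
# Illustrative only: a simplified, whole-day model of the limitation rules in Article 77.
from datetime import date, timedelta

LIMITATION_YEARS = 5  # paragraph 1

def latest_day_to_impose_penalty(infringement_end: date,
                                 interruptions: list[date],
                                 suspension_days: int = 0) -> date:
    """Last day on which a fine or periodic penalty payment could still be imposed
    under this simplified reading of paragraphs 2 to 5."""
    # Ordinary rule: five years from the day the clock (re)starts; each interruption
    # under paragraph 3 starts time running afresh (paragraph 4, first sentence).
    start = max([infringement_end, *interruptions])
    ordinary = start.replace(year=start.year + LIMITATION_YEARS)
    # Absolute cap: twice the limitation period from the original starting day,
    # extended by any suspension under paragraph 5 (paragraph 4, second and third sentences).
    cap = infringement_end.replace(year=infringement_end.year + 2 * LIMITATION_YEARS)
    cap += timedelta(days=suspension_days)
    return min(ordinary, cap)

# Invented example: the infringement ceases in 2024; requests for information
# interrupt the period in 2028 and 2030; no court proceedings suspend it.
print(latest_day_to_impose_penalty(
    infringement_end=date(2024, 3, 1),
    interruptions=[date(2028, 1, 15), date(2030, 6, 1)],
))  # 2034-03-01: the ten-year cap applies before 2030-06-01 plus five years
```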

Article 79

Right to be heard and access to the file

1.   Before adopting a decision pursuant to Article 73(1), Article 74 or 76, the Commission shall give the provider of the very large online_platform or of the very large online_search_engine concerned or other person referred to in Article 67(1) the opportunity of being heard on:

(a)

preliminary findings of the Commission, including any matter to which the Commission has taken objections; and

(b)

measures that the Commission may intend to take in view of the preliminary findings referred to in point (a).

2.   The provider of the very large online_platform or of the very large online_search_engine concerned or other person referred to in Article 67(1) may submit its observations on the Commission’s preliminary findings within a reasonable period set by the Commission in its preliminary findings, which may not be less than 14 days.

3.   The Commission shall base its decisions only on objections on which the parties concerned have been able to comment.

4.   The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the Commission's file under the terms of a negotiated disclosure, subject to the legitimate interest of the provider of the very large online_platform or of the very large online_search_engine or other person concerned in the protection of their business secrets. The Commission shall have the power to adopt decisions setting out such terms of disclosure in case of disagreement between the parties. The right of access to the file of the Commission shall not extend to confidential information and internal documents of the Commission, the Board, Digital Services Coordinators, other competent authorities or other public authorities of the Member States. In particular, the right of access shall not extend to correspondence between the Commission and those authorities. Nothing in this paragraph shall prevent the Commission from disclosing and using information necessary to prove an infringement.

5.   The information collected pursuant to Articles 67, 68 and 69 shall be used only for the purpose of this Regulation.

Article 84

Professional secrecy

Without prejudice to the exchange and to the use of information referred to in this Chapter, the Commission, the Board, Member States’ competent authorities and their respective officials, servants and other persons working under their supervision, and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 72(2), shall not disclose information acquired or exchanged by them pursuant to this Regulation and of the kind covered by the obligation of professional secrecy.

Article 87

Exercise of the delegation

1.   The power to adopt delegated acts is conferred on the Commission subject to the conditions laid down in this Article.

2.   The delegation of power referred to in Articles 24, 33, 37, 40 and 43 shall be conferred on the Commission for five years starting from 16 November 2022. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period.

3.   The delegation of power referred to in Articles 24, 33, 37, 40 and 43 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.

4.   Before adopting a delegated act, the Commission shall consult experts designated by each Member State in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-Making.

5.   As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament and to the Council.

6.   A delegated act adopted pursuant to Articles 24, 33, 37, 40 and 43 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.
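
Paragraph 6 is essentially a timing rule: a three-month objection window from notification, extendable once by a further three months. As a rough illustration only (the helper functions, the whole-calendar-month arithmetic and the omission of the early non-objection route are assumptions of this sketch, not anything defined by the Regulation):

```python
# Illustrative sketch only; not a legal computation.
import calendar
from datetime import date


def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)


def objection_deadline(notified_on: date, extended: bool = False) -> date:
    """Last day of the objection window: three months from notification,
    plus three more if the Parliament or the Council asks for an extension."""
    return add_months(notified_on, 6 if extended else 3)


def may_enter_into_force(notified_on: date, today: date,
                         objection_raised: bool, extended: bool = False) -> bool:
    """True once the (possibly extended) window has passed without objection.
    The early entry into force where both institutions state they will not
    object is not modelled here."""
    return not objection_raised and today > objection_deadline(notified_on, extended)


# Example: an act notified on 15 January 2025 with the window extended once.
print(objection_deadline(date(2025, 1, 15), extended=True))            # 2025-07-15
print(may_enter_into_force(date(2025, 1, 15), date(2025, 8, 1),
                           objection_raised=False, extended=True))     # True
```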

Article 89

Amendments to Directive 2000/31/EC

1.   Articles 12 to 15 of Directive 2000/31/EC are deleted.

2.   References to Articles 12 to 15 of Directive 2000/31/EC shall be construed as references to Articles 4, 5, 6 and 8 of this Regulation, respectively.

Article 91

Review

1.   By 18 February 2027, the Commission shall evaluate and report to the European Parliament, the Council and the European Economic and Social Committee on the potential effect of this Regulation on the development and economic growth of small and medium-sized enterprises.

By 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council and the European Economic and Social Committee on:

(a)

the application of Article 33, including the scope of providers of intermediary_services covered by the obligations set out in Section 5 of Chapter III of this Regulation;

(b)

the way that this Regulation interacts with other legal acts, in particular the acts referred to in Article 2(3) and (4).

2.   By 17 November 2027, and every five years thereafter, the Commission shall evaluate this Regulation, and report to the European Parliament, the Council and the European Economic and Social Committee.

This report shall address in particular:

(a)

the application of paragraph 1, second subparagraph, points (a) and (b);

(b)

the contribution of this Regulation to the deepening and efficient functioning of the internal market for intermediary_services, in particular as regards the cross-border provision of digital services;

(c)

the application of Articles 13, 16, 20, 21, 45 and 46;

(d)

the scope of the obligations on small and micro enterprises;

(e)

the effectiveness of the supervision and enforcement mechanisms;

(f)

the impact on the respect for the right to freedom of expression and information.

3.   Where appropriate, the report referred to in paragraphs 1 and 2 shall be accompanied by a proposal for amendment of this Regulation.

4.   The Commission shall, in the report referred to in paragraph 2 of this Article, also evaluate and report on the annual reports on their activities by the Digital Services Coordinators provided to the Commission and the Board pursuant to Article 55(1).

5.   For the purpose of paragraph 2, Member States and the Board shall send information on the request of the Commission.

6.   In carrying out the evaluations referred to in paragraph 2, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources, and shall pay specific attention to small and medium-sized enterprises and the position of new competitors.

7.   By 18 February 2027, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and of the application of Article 43, and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking utmost account of the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.

