

Digital Services Act (Regulation (EU) 2022/2065) EN




Index of provisions containing 'significant' (occurrence count shown in parentheses):

CHAPTER I
GENERAL PROVISIONS
  • Art. 3 Definitions (1)

CHAPTER II
LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

CHAPTER III
DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

SECTION 1
Provisions applicable to all providers of intermediary services
  • Art. 14 Terms and conditions (1)

SECTION 2
Additional provisions applicable to providers of hosting services, including online platforms

SECTION 3
Additional provisions applicable to providers of online platforms
  • Art. 22 Trusted flaggers (1)
  • Art. 27 Recommender system transparency (1)

SECTION 4
Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

SECTION 5
Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
  • Art. 36 Crisis response mechanism (2)
  • Art. 40 Data access and scrutiny (1)
  • Art. 42 Transparency reporting obligations (1)

SECTION 6
Other provisions concerning due diligence obligations
  • Art. 45 Codes of conduct (1)

CHAPTER IV
IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1
Competent authorities and national Digital Services Coordinators

SECTION 2
Competences, coordinated investigation and consistency mechanisms

SECTION 3
European Board for Digital Services

SECTION 4
Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

SECTION 5
Common provisions on enforcement

SECTION 6
Delegated and implementing acts

CHAPTER V
FINAL PROVISIONS



Article 3

Definitions

For the purpose of this Regulation, the following definitions shall apply:

(a)

‘information_society_service’ means a ‘service’ as defined in Article 1(1), point (b), of Directive (EU) 2015/1535;

(b)

‘recipient_of_the_service’ means any natural or legal person who uses an intermediary_service, in particular for the purposes of seeking information or making it accessible;

(c)

‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;

(d)

‘to_offer_services_in_the_Union’ means enabling natural or legal persons in one or more Member States to use the services of a provider of intermediary_services that has a substantial_connection_to_the_Union;

(e)

‘substantial_connection_to_the_Union’ means a connection of a provider of intermediary_services with the Union resulting either from its establishment in the Union or from specific factual criteria, such as:

a significant number of recipients of the service in one or more Member States in relation to its or their population; or

the targeting of activities towards one or more Member States;

(f)

‘trader’ means any natural person, or any legal person irrespective of whether it is privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

(g)

‘intermediary_service’ means one of the following information_society_services:

(i)

a ‘mere_conduit’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, or the provision of access to a communication network;

(ii)

a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

(iii)

a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient_of_the_service;

(h)

‘illegal_content’ means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

(i)

‘online_platform’ means a hosting service that, at the request of a recipient_of_the_service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

(j)

‘online_search_engine’ means an intermediary_service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

(k)

‘dissemination_to_the_public’ means making information available, at the request of the recipient_of_the_service who provided the information, to a potentially unlimited number of third parties;

(l)

‘distance_contract’ means ‘distance_contract’ as defined in Article 2, point (7), of Directive 2011/83/EU;

(m)

‘online_interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

(n)

‘Digital_Services_Coordinator_of_establishment’ means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary_service is located or its legal representative resides or is established;

(o)

‘Digital_Services_Coordinator_of_destination’ means the Digital Services Coordinator of a Member State where the intermediary_service is provided;

(p)

‘active recipient of an online_platform’ means a recipient_of_the_service that has engaged with an online_platform by either requesting the online_platform to host information or being exposed to information hosted by the online_platform and disseminated through its online_interface;

(q)

‘active recipient of an online_search_engine’ means a recipient_of_the_service that has submitted a query to an online_search_engine and been exposed to information indexed and presented on its online_interface;

(r)

‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online_platform on its online_interface against remuneration specifically for promoting that information;

(s)

‘recommender_system’ means a fully or partially automated system used by an online_platform to suggest in its online_interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient_of_the_service or otherwise determining the relative order or prominence of information displayed;

(t)

‘content_moderation’ means the activities, whether automated or not, undertaken by providers of intermediary_services, that are aimed, in particular, at detecting, identifying and addressing illegal_content or information incompatible with their terms_and_conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal_content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account;

(u)

‘terms_and_conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary_services and the recipients of the service;

(v)

‘persons_with_disabilities’ means ‘persons_with_disabilities’ as referred to in Article 3, point (1), of Directive (EU) 2019/882 of the European Parliament and of the Council (38);

(w)

‘commercial_communication’ means ‘commercial_communication’ as defined in Article 2, point (f), of Directive 2000/31/EC;

(x)

‘turnover’ means the amount derived by an undertaking within the meaning of Article 5(1) of Council Regulation (EC) No 139/2004 (39).


Article 14

Terms and conditions

1.   Providers of intermediary_services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms_and_conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content_moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary_services shall inform the recipients of the service of any significant change to the terms_and_conditions.

3.   Where an intermediary_service is primarily directed at minors or is predominantly used by them, the provider of that intermediary_service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary_services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online_platforms and of very large online_search_engines shall provide recipients of services with a concise, easily accessible and machine-readable summary of the terms_and_conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online_platforms and very large online_search_engines within the meaning of Article 33 shall publish their terms_and_conditions in the official languages of all the Member States in which they offer their services.
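Article 14(1) requires the restriction information to be publicly available in an easily accessible and machine-readable format, and Article 14(5) adds a concise machine-readable summary for very large online_platforms and online_search_engines. The Regulation prescribes no schema or format; the Python sketch below only illustrates one possible structured publication, and every class, field name and value in it is a hypothetical example, not a requirement of the Regulation.

```python
# Illustrative only: the DSA does not prescribe any schema or field names.
# Every identifier below (ModerationPolicy, TermsSummary, ...) is a
# hypothetical example of a "machine-readable" publication under Art. 14.
import json
from dataclasses import dataclass, asdict, field

@dataclass
class ModerationPolicy:
    restriction: str          # e.g. the restriction imposed on recipients' information
    tools: list[str]          # policies, procedures, measures and tools, Art. 14(1)
    redress: list[str]        # available remedies and redress mechanisms, Art. 14(5)

@dataclass
class TermsSummary:
    provider: str
    languages: list[str]              # official languages covered, cf. Art. 14(6)
    last_significant_change: str      # ISO date of the last change notified under Art. 14(2)
    policies: list[ModerationPolicy] = field(default_factory=list)

summary = TermsSummary(
    provider="ExamplePlatform",
    languages=["en", "fr", "de"],
    last_significant_change="2024-01-15",
    policies=[ModerationPolicy(
        restriction="Demotion of content incompatible with the terms",
        tools=["automated ranking signal", "human review on appeal"],
        redress=["internal complaint handling", "out-of-court dispute settlement"],
    )],
)

# Serialising to JSON is one way a provider might make the information
# "easily accessible and machine-readable"; the Regulation names no format.
print(json.dumps(asdict(summary), indent=2))
```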

Article 22

Trusted flaggers

1.   Providers of online_platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.

2.   The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

(a)

it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal_content;

(b)

it is independent from any provider of online_platforms;

(c)

it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

3.   Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:

(a)

the identity of the provider of hosting services,

(b)

the type of allegedly illegal_content notified,

(c)

the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.
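Article 22(3) thus asks for a yearly count of notices broken down by provider of hosting services, type of allegedly illegal content and action taken, with no personal data. As a minimal illustration only, the sketch below aggregates hypothetical notice records into those three breakdowns; the record fields and sample values are invented for the example.

```python
# A minimal sketch of the Art. 22(3) breakdown; the notice fields and sample
# records are hypothetical, and real reports must contain no personal data.
from collections import Counter

notices = [
    {"provider": "ExampleHost A", "content_type": "counterfeit goods", "action": "removal"},
    {"provider": "ExampleHost A", "content_type": "counterfeit goods", "action": "no action"},
    {"provider": "ExampleHost B", "content_type": "illegal hate speech", "action": "disabling of access"},
]

def annual_report(notices):
    """Count notices by each Art. 22(3) category: provider, content type, action taken."""
    return {
        "by_provider": Counter(n["provider"] for n in notices),
        "by_content_type": Counter(n["content_type"] for n in notices),
        "by_action": Counter(n["action"] for n in notices),
        "total": len(notices),
    }

print(annual_report(notices))
```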

4.   Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.

5.   The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.

6.   Where a provider of online_platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online_platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.

7.   The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online_platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.

8.   The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online_platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.

Article 27

Recommender system transparency

1.   Providers of online_platforms that use recommender_systems shall set out in their terms_and_conditions, in plain and intelligible language, the main parameters used in their recommender_systems, as well as any options for the recipients of the service to modify or influence those main parameters.

2.   The main parameters referred to in paragraph 1 shall explain why certain information is suggested to the recipient_of_the_service. They shall include, at least:

(a)

the criteria which are most significant in determining the information suggested to the recipient_of_the_service;

(b)

the reasons for the relative importance of those parameters.

3.   Where several options are available pursuant to paragraph 1 for recommender_systems that determine the relative order of information presented to recipients of the service, providers of online_platforms shall also make available a functionality that allows the recipient_of_the_service to select and to modify at any time their preferred option. That functionality shall be directly and easily accessible from the specific section of the online_platform’s online_interface where the information is being prioritised.
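Article 27(3) requires that, where several recommender options exist, the recipient can select and modify the preferred option at any time. A minimal sketch of such a user-selectable ordering follows; the option names ('relevance', 'chronological') and the scoring logic are hypothetical illustrations, not requirements of the Regulation.

```python
# Illustrative sketch of Art. 27(3): several ordering options are offered and
# the recipient can switch between them at any time. The option names and the
# scoring logic below are hypothetical, not taken from the Regulation.
from datetime import datetime, timezone

ITEMS = [
    {"id": 1, "relevance": 0.9, "posted": datetime(2024, 5, 1, tzinfo=timezone.utc)},
    {"id": 2, "relevance": 0.4, "posted": datetime(2024, 6, 1, tzinfo=timezone.utc)},
]

# "Main parameters" in the sense of Art. 27(2): which criterion drives the ranking.
OPTIONS = {
    "relevance": lambda item: -item["relevance"],                 # most relevant first
    "chronological": lambda item: -item["posted"].timestamp(),    # newest first
}

def rank(items, preferred_option: str):
    """Order items according to the option the recipient has selected (Art. 27(3))."""
    return sorted(items, key=OPTIONS[preferred_option])

print([i["id"] for i in rank(ITEMS, "chronological")])
```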

Article 36

Crisis response mechanism

1.   Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision requiring one or more providers of very large online_platforms or of very large online_search_engines to take one or more of the following actions:

(a)

assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;

(b)

identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

(c)

report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.

When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

2.   For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

3.   When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:

(a)

the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;

(b)

the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;

(c)

the actions required by the decision are limited to a period not exceeding three months.

4.   After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:

(a)

notify the decision to the provider or providers to which the decision is addressed;

(b)

make the decision publicly available; and

(c)

inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.

5.   The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.

6.   The Commission may on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).

7.   The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.

Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate, it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.

8.   Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:

(a)

revoking the decision and, where appropriate, requiring the very large online_platform or very large online_search_engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;

(b)

extending the period referred to in paragraph 3, point (c), by a period of no more than three months;

(c)

taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

9.   The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.

10.   The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.

11.   The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.

Article 40

Data access and scrutiny

1.   Providers of very large online_platforms or of very large online_search_engines shall provide the Digital_Services_Coordinator_of_establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation.

2.   Digital Services Coordinators and the Commission shall use the data accessed pursuant to paragraph 1 only for the purpose of monitoring and assessing compliance with this Regulation and shall take due account of the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of personal data, the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

3.   For the purposes of paragraph 1, providers of very large online_platforms or of very large online_search_engines shall, at the request of either the Digital_Services_Coordinator_of_establishment or the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender_systems.

4.   Upon a reasoned request from the Digital_Services_Coordinator_of_establishment, providers of very large online_platforms or of very large online_search_engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.

5.   Within 15 days following receipt of a request as referred to in paragraph 4, providers of very large online_platforms or of very large online_search_engines may request the Digital_Services_Coordinator_of_establishment to amend the request, where they consider that they are unable to give access to the data requested for one of the following two reasons:

(a)

they do not have access to the data;

(b)

giving access to the data will lead to significant vulnerabilities in the security of their service or the protection of confidential information, in particular trade secrets.

6.   Requests for amendment pursuant to paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.

The Digital_Services_Coordinator_of_establishment shall decide on the request for amendment within 15 days and communicate to the provider of the very large online_platform or of the very large online_search_engine its decision and, where relevant, the amended request and the new period to comply with the request.
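Paragraphs 5 and 6 thus define a narrow amendment procedure: within 15 days the provider may ask the Digital_Services_Coordinator_of_establishment to amend a data access request, but only on the two grounds listed, and the amendment request must propose alternative means of access. The sketch below only illustrates that decision logic under stated assumptions; the dataset names, fields and response contents are hypothetical.

```python
# Sketch of the choice a provider faces under Art. 40(5): grant access, or ask
# the Digital Services Coordinator of establishment to amend the request on one
# of the two permitted grounds, proposing alternatives per Art. 40(6).
# All field names, dataset names and the example request are hypothetical.
from dataclasses import dataclass

@dataclass
class DataAccessRequest:
    requested_datasets: list[str]
    deadline_days: int          # the "reasonable period specified in that request"

AVAILABLE = {"ad-repository-extracts", "public-post-metadata"}
SECURITY_SENSITIVE = {"raw-moderation-queue"}   # assumed to expose trade secrets or security details

def respond(request: DataAccessRequest) -> dict:
    missing = [d for d in request.requested_datasets if d not in AVAILABLE | SECURITY_SENSITIVE]
    sensitive = [d for d in request.requested_datasets if d in SECURITY_SENSITIVE]
    if missing:
        return {"amendment": "no access to the data (Art. 40(5)(a))", "datasets": missing}
    if sensitive:
        return {"amendment": "significant security/confidentiality vulnerabilities (Art. 40(5)(b))",
                "datasets": sensitive,
                "alternative": "aggregated extracts via another interface (Art. 40(6))"}
    return {"granted": request.requested_datasets, "within_days": request.deadline_days}

print(respond(DataAccessRequest(["public-post-metadata", "raw-moderation-queue"], 30)))
```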

7.   Providers of very large online_platforms or of very large online_search_engines shall facilitate and provide access to data pursuant to paragraphs 1 and 4 through appropriate interfaces specified in the request, including online databases or application programming interfaces.

8.   Upon a duly substantiated application from researchers, the Digital_Services_Coordinator_of_establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of very large online_platform or of very large online_search_engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:

(a)

they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;

(b)

they are independent from commercial interests;

(c)

their application discloses the funding of the research;

(d)

they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;

(e)

their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in paragraph 4;

(f)

the planned research activities will be carried out for the purposes laid down in paragraph 4;

(g)

they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.

Upon receipt of the application pursuant to this paragraph, the Digital_Services_Coordinator_of_establishment shall inform the Commission and the Board.

9.   Researchers may also submit their application to the Digital Services Coordinator of the Member State of the research organisation to which they are affiliated. Upon receipt of the application pursuant to this paragraph the Digital Services Coordinator shall conduct an initial assessment as to whether the respective researchers meet all of the conditions set out in paragraph 8. The respective Digital Services Coordinator shall subsequently send the application, together with the supporting documents submitted by the respective researchers and the initial assessment, to the Digital_Services_Coordinator_of_establishment. The Digital_Services_Coordinator_of_establishment shall take a decision whether to award a researcher the status of ‘vetted researcher’ without undue delay.

While taking due account of the initial assessment provided, the final decision to award a researcher the status of ‘vetted researcher’ lies within the competence of the Digital_Services_Coordinator_of_establishment, pursuant to paragraph 8.

10.   The Digital Services Coordinator that awarded the status of vetted researcher and issued the reasoned request for data access to the providers of very large online_platforms or of very large online_search_engines in favour of a vetted researcher shall issue a decision terminating the access if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, that the vetted researcher no longer meets the conditions set out in paragraph 8, and shall inform the provider of the very large online_platform or of the very large online_search_engine concerned of the decision. Before terminating the access, the Digital Services Coordinator shall allow the vetted researcher to react to the findings of its investigation and to its intention to terminate the access.

11.   Digital Services Coordinators of establishment shall communicate to the Board the names and contact information of the natural persons or entities to which they have awarded the status of ‘vetted researcher’ in accordance with paragraph 8, as well as the purpose of the research in respect of which the application was made or, where they have terminated the access to the data in accordance with paragraph 10, communicate that information to the Board.

12.   Providers of very large online_platforms or of very large online_search_engines shall give access without undue delay to data, including, where technically possible, to real-time data, provided that the data is publicly accessible in their online_interface by researchers, including those affiliated to not for profit bodies, organisations and associations, who comply with the conditions set out in paragraph 8, points (b), (c), (d) and (e), and who use the data solely for performing research that contributes to the detection, identification and understanding of systemic risks in the Union pursuant to Article 34(1).

13.   The Commission shall, after consulting the Board, adopt delegated acts supplementing this Regulation by laying down the technical conditions under which providers of very large online_platforms or of very large online_search_engines are to share data pursuant to paragraphs 1 and 4 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with researchers can take place in compliance with Regulation (EU) 2016/679, as well as relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data, taking into account the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

Article 42

Transparency reporting obligations

1.   Providers of very large online_platforms or of very large online_search_engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online_platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a)

the human resources that the provider of very large online_platforms dedicates to content_moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b)

the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c)

the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.

3.   In addition to the information referred to in Article 24(2), the providers of very large online_platforms or of very large online_search_engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.
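Paragraphs 2 and 3 require the transparency reports of very large online_platforms to break down moderation resources, accuracy indicators and average monthly recipients by language or by Member State. The sketch below shows one hypothetical machine-readable shape for that breakdown; all figures, language codes and field names are invented for illustration.

```python
# Minimal sketch of the per-language and per-Member-State breakdowns required
# by Art. 42(2) and 42(3); the figures, codes and field names are hypothetical.
report = {
    "reporting_period": "2024-H1",
    "content_moderation_by_language": {            # Art. 42(2)(a) and (b)
        "en": {"moderators": 120, "qualifications": "trained reviewers, legal support"},
        "de": {"moderators": 35,  "qualifications": "trained reviewers"},
    },
    "accuracy_indicators_by_language": {            # Art. 42(2)(c), cf. Art. 15(1)(e)
        "en": {"automated_detection_error_rate": 0.04},
        "de": {"automated_detection_error_rate": 0.06},
    },
    "average_monthly_recipients_per_member_state": {  # Art. 42(3)
        "DE": 12_500_000, "FR": 9_800_000,
    },
}

for lang, staff in report["content_moderation_by_language"].items():
    print(lang, staff["moderators"], "moderators")
```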

4.   Providers of very large online_platforms or of very large online_search_engines shall transmit to the Digital_Services_Coordinator_of_establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a)

a report setting out the results of the risk assessment pursuant to Article 34;

(b)

the specific mitigation measures put in place pursuant to Article 35(1);

(c)

the audit report provided for in Article 37(4);

(d)

the audit implementation report provided for in Article 37(6);

(e)

where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of very large online_platform or of very large online_search_engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital_Services_Coordinator_of_establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.

Article 45

Codes of conduct

1.   The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal_content and systemic risks, in accordance with Union law in particular on competition and the protection of personal data.

2.   Where significant systemic risks within the meaning of Article 34(1) emerge and concern several very large online_platforms or very large online_search_engines, the Commission may invite the providers of very large online_platforms concerned or the providers of very large online_search_engines concerned, and other providers of very large online_platforms, of very large online_search_engines, of online_platforms and of other intermediary_services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

3.   When giving effect to paragraphs 1 and 2, the Commission and the Board, and where relevant other bodies, shall aim to ensure that the codes of conduct clearly set out their specific objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.

4.   The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives, having regard to the key performance indicators that they might contain. They shall publish their conclusions.

The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct.

In the case of systematic failure to comply with the codes of conduct, the Commission and the Board may invite the signatories to the codes of conduct to take the necessary action.

