

Digital Services Act (Regulation (EU) 2022/2065)


Provisions containing the term 'voluntary'


Index:

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES
  • Art. 7 Voluntary own-initiative investigations and legal compliance

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms
  • Art. 17 Statement of reasons

    SECTION 3
    Additional provisions applicable to providers of online platforms

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
  • Art. 37 Independent audit

    SECTION 6
    Other provisions concerning due diligence obligations
  • Art. 44 Standards
  • Art. 45 Codes of conduct
  • Art. 46 Codes of conduct for online advertising
  • Art. 48 Crisis protocols

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS



Article 7

Voluntary own-initiative investigations and legal compliance

Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 4, 5 and 6 solely because they, in good faith and in a diligent manner, carry out voluntary own-initiative investigations into, or take other measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union law and national law in compliance with Union law, including the requirements set out in this Regulation.

Article 17

Statement of reasons

1.   Providers of hosting services shall provide a clear and specific statement of reasons to any affected recipients of the service for any of the following restrictions imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions:

(a)

any restrictions of the visibility of specific items of information provided by the recipient of the service, including removal of content, disabling access to content, or demoting content;

(b)

suspension, termination or other restriction of monetary payments;

(c)

suspension or termination of the provision of the service in whole or in part;

(d)

suspension or termination of the recipient of the service's account.

2.   Paragraph 1 shall only apply where the relevant electronic contact details are known to the provider. It shall apply at the latest from the date that the restriction is imposed, regardless of why or how it was imposed.

Paragraph 1 shall not apply where the information is deceptive high-volume commercial content.

3.   The statement of reasons referred to in paragraph 1 shall at least contain the following information:

(a)

information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, or imposes other measures referred to in paragraph 1 with regard to the information, and, where relevant, the territorial scope of the decision and its duration;

(b)

the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted in accordance with Article 16 or based on voluntary own-initiative investigations and, where strictly necessary, the identity of the notifier;

(c)

where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of content detected or identified using automated means;

(d)

where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;

(e)

where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider of hosting services, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;

(f)

clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.

4.   The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in paragraph 3, point (f).

5.   This Article shall not apply to any orders referred to in Article 9.
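
The minimum contents required by paragraph 3 amount, in effect, to a structured record. The following TypeScript sketch illustrates one possible shape for such a record; every name in it is invented for illustration and does not reproduce any official schema.

// Hypothetical statement-of-reasons record covering the minimum
// contents of Article 17(3). Field names are illustrative only.

type Restriction =
  | "visibility_restriction"   // Art. 17(1)(a): removal, disabled access, demotion
  | "monetary_restriction"     // Art. 17(1)(b): suspension or termination of payments
  | "service_restriction"      // Art. 17(1)(c): suspension or termination of the service
  | "account_restriction";     // Art. 17(1)(d): suspension or termination of the account

interface StatementOfReasons {
  restriction: Restriction;                             // point (a): measure imposed
  territorialScope?: string[];                          // point (a): where relevant
  duration?: string;                                    // point (a): where relevant
  factsAndCircumstances: string;                        // point (b)
  triggeredBy: "article_16_notice" | "own_initiative";  // point (b)
  notifierIdentity?: string;                            // point (b): only where strictly necessary
  automatedMeansUsedInDecision?: boolean;               // point (c)
  contentDetectedByAutomatedMeans?: boolean;            // point (c)
  legalGround?: string;                                 // point (d): for allegedly illegal content
  legalExplanation?: string;                            // point (d)
  contractualGround?: string;                           // point (e): for terms-and-conditions decisions
  contractualExplanation?: string;                      // point (e)
  redressOptions: string[];                             // point (f): internal complaints, out-of-court settlement, courts
}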

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online platform or of very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online platform or of very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
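
Read together, the independence conditions in paragraph 3 behave like a simple eligibility predicate. The TypeScript sketch below is one hypothetical way to encode them; the input shape and all names are assumptions made for clarity.

// Hypothetical auditor-eligibility check reflecting Article 37(3).

interface AuditorProfile {
  independentOfProvider: boolean;            // point (a): no conflicts of interest
  monthsSinceLastNonAuditService: number;    // point (a)(i): cooling-off before the audit
  committedToPostAuditCoolingOff: boolean;   // point (a)(i): 12 months after the audit
  consecutiveYearsAuditingProvider: number;  // point (a)(ii): rotation limit
  feesContingentOnResult: boolean;           // point (a)(iii)
  provenExpertiseAndCapabilities: boolean;   // point (b)
  adheresToProfessionalCodes: boolean;       // point (c)
}

function isEligibleAuditor(a: AuditorProfile): boolean {
  return (
    a.independentOfProvider &&
    a.monthsSinceLastNonAuditService >= 12 &&
    a.committedToPostAuditCoolingOff &&
    a.consecutiveYearsAuditingProvider <= 10 &&  // "longer than 10 consecutive years" is disqualifying
    !a.feesContingentOnResult &&
    a.provenExpertiseAndCapabilities &&
    a.adheresToProfessionalCodes
  );
}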

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).
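
Paragraphs 4 and 5 effectively define the minimum schema of an audit report, and point (g) fixes the opinion to one of three values. A hypothetical TypeScript sketch follows; Article 37(7) leaves the actual reporting templates to Commission delegated acts, so all names here are assumptions.

// Hypothetical minimum audit report structure under Article 37(4) and (5).

type AuditOpinion = "positive" | "positive with comments" | "negative"; // point (g)

interface AuditReport {
  provider: { name: string; address: string; pointOfContact: string }; // point (a)
  periodCovered: { from: string; to: string };                         // point (a)
  auditors: { name: string; address: string }[];                       // point (b)
  declarationOfInterests: string;                                      // point (c)
  elementsAudited: string[];                                           // point (d)
  methodology: string;                                                 // point (d)
  mainFindings: string;                                                // point (e)
  thirdPartiesConsulted: string[];                                     // point (f)
  opinion: AuditOpinion;                                               // point (g)
  // point (h): required whenever the opinion is not "positive"
  operationalRecommendations?: { measure: string; timeframe: string }[];
  // paragraph 5: elements that could not be audited, with reasons
  unauditedElements?: { element: string; reason: string }[];
}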

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.

Article 45

Codes of conduct

1.   The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law in particular on competition and the protection of personal data.

2.   Where significant systemic risks within the meaning of Article 34(1) emerge and concern several very large online platforms or very large online search engines, the Commission may invite the providers of very large online platforms concerned or the providers of very large online search engines concerned, and other providers of very large online platforms, of very large online search engines, of online platforms and of other intermediary services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.

3.   When giving effect to paragraphs 1 and 2, the Commission and the Board, and where relevant other bodies, shall aim to ensure that the codes of conduct clearly set out their specific objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.

4.   The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives, having regard to the key performance indicators that they might contain. They shall publish their conclusions.

The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct.

In the case of systematic failure to comply with the codes of conduct, the Commission and the Board may invite the signatories to the codes of conduct to take the necessary action.

Article 46

Codes of conduct for online advertising

1.   The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level by providers of online platforms and other relevant service providers, such as providers of online advertising intermediary services, other actors involved in the programmatic advertising value chain, or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for actors in the online advertising value chain beyond the requirements of Articles 26 and 39.

2.   The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information that fully respects the rights and interests of all parties involved, as well as a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct at least address the following:

(a)

the transmission of information held by providers of online advertising intermediaries to recipients of the service concerning the requirements set in Article 26(1), points (b), (c) and (d);

(b)

the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 39;

(c)

meaningful information on data monetisation.

3.   The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.

4.   The Commission shall encourage all the actors in the online advertising value chain referred to in paragraph 1 to endorse the commitments stated in the codes of conduct, and to comply with them.

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a)

prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b)

ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;

(c)

where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a)

the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b)

the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c)

a clear procedure for determining when the crisis protocol is to be activated;

(d)

a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e)

safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f)

a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.
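
Paragraph 4 reads as a checklist of the elements every crisis protocol must set out. The following TypeScript sketch shows one hypothetical way to represent such a protocol; the structure and all names are assumptions, not a prescribed format.

// Hypothetical crisis protocol structure reflecting Article 48(4).

interface CrisisProtocol {
  activationParameters: string[];        // point (a): what counts as the extraordinary circumstance
  objectives: string[];                  // point (a)
  participants: {                        // point (b): role and measures per participant
    name: string;
    role: string;
    preparatoryMeasures: string[];
    activeMeasures: string[];
  }[];
  activationProcedure: string;           // point (c)
  durationProcedure: string;             // point (d): strictly limited to what is necessary
  fundamentalRightsSafeguards: string[]; // point (e): e.g. freedom of expression, non-discrimination
  publicReportingProcess: string;        // point (f): measures, duration, outcomes after the crisis
}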

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

