Digital Services Act, Regulation (EU) 2022/2065 (EN)


Regulation 2022/2065 EN, provisions matching the search term 'circumstances'. Output generated live by software developed by IusOnDemand srl.


Index of provisions matching 'circumstances':

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES
  • Art. 6 Hosting
  • Art. 8 No general monitoring or active fact-finding obligations

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms
  • Art. 17 Statement of reasons

    SECTION 3
    Additional provisions applicable to providers of online platforms
  • Art. 23 Measures and protection against misuse

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
  • Art. 36 Crisis response mechanism
  • Art. 37 Independent audit

    SECTION 6
    Other provisions concerning due diligence obligations
  • Art. 48 Crisis protocols

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators
  • Art. 51 Powers of Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS


Article 6

Hosting

1.   Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service, on condition that the provider:

(a)

does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or

(b)

upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.

2.   Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.

3.   Paragraph 1 shall not apply with respect to the liability under consumer protection law of online platforms that allow consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.

4.   This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State's legal system, to require the service provider to terminate or prevent an infringement.
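As an illustration only (not part of the Regulation), the hosting liability exemption in Article 6(1) and (2) can be read as a simple conditional check. The function and parameter names below are hypothetical, chosen for readability:

```python
def hosting_exemption_applies(has_actual_knowledge: bool,
                              aware_of_facts_for_damages: bool,
                              acted_expeditiously: bool,
                              recipient_under_provider_authority: bool) -> bool:
    """Sketch of the Article 6(1)-(2) hosting safe harbour (illustrative only)."""
    # Article 6(2): no exemption where the recipient acts under the
    # provider's authority or control.
    if recipient_under_provider_authority:
        return False
    # Article 6(1)(a): no actual knowledge and, as regards claims for
    # damages, no awareness of facts making the illegality apparent ...
    if not has_actual_knowledge and not aware_of_facts_for_damages:
        return True
    # Article 6(1)(b): ... or, upon obtaining such knowledge or awareness,
    # expeditious removal or disabling of access.
    return acted_expeditiously
```

Note that this sketch deliberately ignores Article 6(3) (consumer-protection liability of certain online platforms), which turns on the average consumer's perception rather than on knowledge.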

Article 8

No general monitoring or active fact-finding obligations

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.

Article 17

Statement of reasons

1.   Providers of hosting services shall provide a clear and specific statement of reasons to any affected recipients of the service for any of the following restrictions imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions:

(a)

any restrictions of the visibility of specific items of information provided by the recipient of the service, including removal of content, disabling access to content, or demoting content;

(b)

suspension, termination or other restriction of monetary payments;

(c)

suspension or termination of the provision of the service in whole or in part;

(d)

suspension or termination of the recipient of the service's account.

2.   Paragraph 1 shall only apply where the relevant electronic contact details are known to the provider. It shall apply at the latest from the date that the restriction is imposed, regardless of why or how it was imposed.

Paragraph 1 shall not apply where the information is deceptive high-volume commercial content.

3.   The statement of reasons referred to in paragraph 1 shall at least contain the following information:

(a)

information on whether the decision entails either the removal of, the disabling of access to, the demotion of or the restriction of the visibility of the information, or the suspension or termination of monetary payments related to that information, or imposes other measures referred to in paragraph 1 with regard to the information, and, where relevant, the territorial scope of the decision and its duration;

(b)

the facts and circumstances relied on in taking the decision, including, where relevant, information on whether the decision was taken pursuant to a notice submitted in accordance with Article 16 or based on voluntary own-initiative investigations and, where strictly necessary, the identity of the notifier;

(c)

where applicable, information on the use made of automated means in taking the decision, including information on whether the decision was taken in respect of content detected or identified using automated means;

(d)

where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground;

(e)

where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider of hosting services, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground;

(f)

clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.

4.   The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in paragraph 3, point (f).

5.   This Article shall not apply to any orders referred to in Article 9.
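For illustration, the minimum content of an Article 17(3) statement of reasons lends itself to a structured record. The model below is a hypothetical sketch; the field names are not prescribed by the Regulation, and points (d) and (e) are treated as alternative grounds:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StatementOfReasons:
    """Hypothetical model of the minimum content under Article 17(3)."""
    restriction_type: str                 # point (a): removal, disabling, demotion, ...
    territorial_scope: Optional[str]      # point (a), where relevant
    duration: Optional[str]               # point (a), where relevant
    facts_and_circumstances: str          # point (b), incl. notice vs own-initiative
    automated_means_used: bool            # point (c), where applicable
    legal_ground: Optional[str]           # point (d): for allegedly illegal content
    contractual_ground: Optional[str]     # point (e): for incompatibility with T&C
    redress_options: list = field(default_factory=list)  # point (f)

    def is_complete(self) -> bool:
        # At least one ground is needed: legal (point (d)) or contractual (point (e)).
        has_ground = self.legal_ground is not None or self.contractual_ground is not None
        return bool(self.facts_and_circumstances) and has_ground and bool(self.redress_options)
```

A completeness check like `is_complete()` only approximates paragraph 3; in particular, it does not capture the clarity and precision requirements of paragraph 4.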

Article 23

Measures and protection against misuse

1.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.

2.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient of the service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient of the service, the individual, the entity or the complainant.

4.   Providers of online platforms shall set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
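The four criteria of Article 23(3) could, in principle, feed an internal triage tool. The sketch below is purely illustrative: the thresholds, the way the signals are combined, and the function name are invented for this example and are in no way set by the Regulation, which requires a case-by-case, diligent and objective human assessment:

```python
def flag_for_misuse_review(absolute_items: int, total_items: int,
                           gravity_score: float, intentional: bool,
                           abs_threshold: int = 10,
                           rel_threshold: float = 0.5) -> bool:
    """Hypothetical triage over the Article 23(3)(a)-(d) criteria.

    Returns True when the case should be escalated for the case-by-case
    assessment the Regulation actually requires; thresholds are invented.
    """
    # Point (b): relative proportion within the given time frame.
    relative = absolute_items / total_items if total_items else 0.0
    # Points (a)-(b): absolute numbers and relative proportion.
    volume_signal = absolute_items >= abs_threshold or relative >= rel_threshold
    # Points (c)-(d): gravity of the misuse and, where identifiable, intent.
    severity_signal = gravity_score >= 0.8 or intentional
    return volume_signal and severity_signal
```

Such a tool could at most pre-sort cases; the suspension decision itself, including the prior warning and the reasonable duration, remains a legal judgment.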

Article 36

Crisis response mechanism

1.   Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision, requiring one or more providers of very large online platforms or of very large online search engines to take one or more of the following actions:

(a)

assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;

(b)

identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

(c)

report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.

When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

2.   For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

3.   When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:

(a)

the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;

(b)

the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;

(c)

the actions required by the decision are limited to a period not exceeding three months.

4.   After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:

(a)

notify the decision to the provider or providers to which the decision is addressed;

(b)

make the decision publicly available; and

(c)

inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.

5.   The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.

6.   The Commission may on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).

7.   The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.

Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.

8.   Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:

(a)

revoking the decision and, where appropriate, requiring the very large online platform or very large online search engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;

(b)

extending the period referred to in paragraph 3, point (c), by a period of no more than three months;

(c)

taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

9.   The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.

10.   The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.

11.   The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online platform or of very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online platform or of very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).
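As a rough illustration, the minimum audit-report content of Article 37(4) and the follow-up duty of Article 37(6) can be expressed as a completeness check. The dictionary keys and the validator below are hypothetical, not a template prescribed by the Regulation (the delegated acts foreseen in paragraph 7 may establish actual reporting templates):

```python
# Hypothetical keys mapping to the Article 37(4) points (a)-(g).
REQUIRED_KEYS = (
    "provider_details",          # point (a): name, address, contact, period covered
    "auditor_details",           # point (b)
    "declaration_of_interests",  # point (c)
    "elements_and_methodology",  # point (d)
    "main_findings",             # point (e)
    "third_parties_consulted",   # point (f)
    "audit_opinion",             # point (g)
)
VALID_OPINIONS = {"positive", "positive with comments", "negative"}

def validate_audit_report(report: dict) -> list:
    """Return a list of problems; empty means the sketch's checks all pass."""
    problems = [k for k in REQUIRED_KEYS if k not in report]
    opinion = report.get("audit_opinion")
    if opinion is not None and opinion not in VALID_OPINIONS:
        problems.append("audit_opinion: not one of the Article 37(4)(g) values")
    # Point (h): a non-'positive' opinion must come with operational
    # recommendations and a recommended timeframe.
    if opinion in {"positive with comments", "negative"} and not report.get("operational_recommendations"):
        problems.append("operational_recommendations: required when opinion is not 'positive'")
    return problems
```

Paragraph 5 (elements that could not be audited) would need an additional field for the explanation of the circumstances, omitted here for brevity.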

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a)

prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b)

ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;

(c)

where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a)

the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b)

the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c)

a clear procedure for determining when the crisis protocol is to be activated;

(d)

a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e)

safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f)

a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

Article 51

Powers of Digital Services Coordinators

1.   Where needed in order to carry out their tasks under this Regulation, Digital Services Coordinators shall have the following powers of investigation, in respect of conduct by providers of intermediary services falling within the competence of their Member State:

(a)

the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Article 37 and Article 75(2), to provide such information without undue delay;

(b)

the power to carry out, or to request a judicial authority in their Member State to order, inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium;

(c)

the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers with their consent by any technical means.

2.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall have the following enforcement powers, in respect of providers of intermediary services falling within the competence of their Member State:

(a)

the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding;

(b)

the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end, or to request a judicial authority in their Member State to do so;

(c)

the power to impose fines, or to request a judicial authority in their Member State to do so, in accordance with Article 52 for failure to comply with this Regulation, including with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(d)

the power to impose a periodic penalty payment, or to request a judicial authority in their Member State to do so, in accordance with Article 52 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this subparagraph or for failure to comply with any of the investigative orders issued pursuant to paragraph 1 of this Article;

(e)

the power to adopt interim measures or to request the competent national judicial authority in their Member State to do so, to avoid the risk of serious harm.

As regards the first subparagraph, points (c) and (d), Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after providing those other persons in good time with all relevant information relating to such orders, including the applicable period, the fines or periodic payments that may be imposed for failure to comply and the possibilities for redress.

3.   Where needed for carrying out their tasks under this Regulation, Digital Services Coordinators shall, in respect of providers of intermediary services falling within the competence of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted and the infringement has not been remedied or is continuing and is causing serious harm which cannot be avoided through the exercise of other powers available under Union or national law, also have the power to take the following measures:

(a)

to require the management body of those providers, without undue delay, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken;

(b)

where the Digital Services Coordinator considers that a provider of intermediary services has not sufficiently complied with the requirements referred to in point (a), that the infringement has not been remedied or is continuing and is causing serious harm, and that that infringement entails a criminal offence involving a threat to the life or safety of persons, to request that the competent judicial authority of its Member State order the temporary restriction of access of recipients to the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place.

The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 82, prior to submitting the request referred to in the first subparagraph, point (b), of this paragraph invite interested parties to submit written observations within a period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider of intermediary services, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned.

The restriction of access shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same lengths, subject to a maximum number of extensions set by that judicial authority. The Digital Services Coordinator shall only extend the period where, having regard to the rights and interests of all parties affected by that restriction and all relevant circumstances, including any information that the provider of intermediary services, the addressee or addressees and any other third party that demonstrated a legitimate interest may provide to it, it considers that both of the following conditions have been met:

(a)

the provider of intermediary services has failed to take the necessary measures to terminate the infringement;

(b)

the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist.

Where the Digital Services Coordinator considers that the conditions set out in the third subparagraph, points (a) and (b), have been met but it cannot further extend the period pursuant to the third subparagraph, it shall submit a new request to the competent judicial authority, as referred to in the first subparagraph, point (b).

4.   The powers listed in paragraphs 1, 2 and 3 shall be without prejudice to Section 3.

5.   The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant.

6.   Member States shall lay down specific rules and procedures for the exercise of the powers pursuant to paragraphs 1, 2 and 3 and shall ensure that any exercise of those powers is subject to adequate safeguards laid down in the applicable national law in compliance with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties.
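The timeline arithmetic of the Article 51(3) access restriction (an initial four-week period, extendable in further periods of the same length up to a court-set maximum number of extensions) can be sketched as follows. The helper name and structure are hypothetical; the substantive conditions for each extension, in the third subparagraph, points (a) and (b), are of course legal assessments, not arithmetic:

```python
from datetime import date, timedelta

def restriction_end(start: date, extensions_granted: int,
                    max_extensions: int) -> date:
    """Illustrative end date of an Article 51(3) temporary access restriction.

    Initial period: four weeks; each extension adds another four weeks,
    up to the maximum number of extensions set by the judicial authority.
    """
    if extensions_granted > max_extensions:
        # Beyond the court-set maximum, the Coordinator must submit a new
        # request to the competent judicial authority (fourth subparagraph).
        raise ValueError("extensions exceed the maximum set by the judicial authority")
    return start + timedelta(weeks=4 * (1 + extensions_granted))
```

For example, with no extensions a restriction starting on 1 January 2024 would run for 28 days; each granted extension pushes the end date out by a further 28 days.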

