Digital Services Act (Regulation (EU) 2022/2065)

Article 2

Scope

1.   This Regulation shall apply to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.

2.   This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.

3.   This Regulation shall not affect the application of Directive 2000/31/EC.

4.   This Regulation is without prejudice to the rules laid down by other Union legal acts regulating other aspects of the provision of intermediary services in the internal market or specifying and complementing this Regulation, in particular, the following:

(a)

Directive 2010/13/EU;

(b)

Union law on copyright and related rights;

(c)

Regulation (EU) 2021/784;

(d)

Regulation (EU) 2019/1148;

(e)

Regulation (EU) 2019/1150;

(f)

Union law on consumer protection and product safety, including Regulations (EU) 2017/2394 and (EU) 2019/1020 and Directives 2001/95/EC and 2013/11/EU;

(g)

Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC;

(h)

Union law in the field of judicial cooperation in civil matters, in particular Regulation (EU) No 1215/2012 or any Union legal act laying down the rules on law applicable to contractual and non-contractual obligations;

(i)

Union law in the field of judicial cooperation in criminal matters, in particular a Regulation on European Production and Preservation Orders for electronic evidence in criminal matters;

(j)

a Directive laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings.

Article 23

Measures and protection against misuse

1.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.

2.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient of the service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient of the service, the individual, the entity or the complainant.

4.   Providers of online platforms shall set out, in a clear and detailed manner, in their terms_and_conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
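The four criteria in Article 23(3) can be read as a simple, auditable scoring routine applied before any suspension under paragraphs 1 or 2. The sketch below is a minimal illustration only: the MisuseSignals record, the thresholds and the way the criteria are combined are assumptions, not anything the Regulation prescribes, and Article 23(4) expects each provider to define and publish its own policy.

```python
from dataclasses import dataclass

@dataclass
class MisuseSignals:
    """Facts available to the provider about one recipient or complainant (Article 23(3))."""
    manifestly_illegal_items: int  # (a) absolute number of manifestly illegal items or unfounded notices
    total_items: int               # (b) denominator: all items or notices submitted in the same time frame
    gravity: float                 # (c) 0.0-1.0 estimate of the gravity of the misuse and its consequences
    intent_established: bool       # (d) whether the intention could be identified

def indicates_misuse(s: MisuseSignals,
                     min_absolute: int = 5,
                     min_ratio: float = 0.5,
                     min_gravity: float = 0.7) -> bool:
    """Illustrative case-by-case test; the thresholds are placeholder values, not legal standards."""
    ratio = s.manifestly_illegal_items / s.total_items if s.total_items else 0.0
    volume_based = s.manifestly_illegal_items >= min_absolute and ratio >= min_ratio   # (a) + (b)
    conduct_based = s.gravity >= min_gravity and s.intent_established                  # (c) + (d)
    return volume_based or conduct_based

# Example: 6 manifestly unfounded notices out of 8 submitted in the period under review
print(indicates_misuse(MisuseSignals(6, 8, 0.3, False)))  # True
```

Any real policy would still need the prior warning and the reasonable suspension period required by paragraphs 1 and 2.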

Article 24

Transparency reporting obligations for providers of online platforms

1.   In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a)

the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b)

the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
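The figure required by paragraph 2 is a plain six-month average of monthly active recipients in the Union. Pending the Article 33(3) delegated acts that will settle how an active recipient is counted and deduplicated, a minimal sketch of the arithmetic might look as follows; the monthly counts are assumed inputs.

```python
from statistics import mean

def average_monthly_active_recipients(monthly_counts: list[int]) -> int:
    """Six-month average of monthly active recipients in the Union (Article 24(2)).

    `monthly_counts` holds one deduplicated count per calendar month, most recent last;
    the counting and deduplication methodology itself is left to the Article 33(3) delegated acts.
    """
    if len(monthly_counts) < 6:
        raise ValueError("six full months of data are required")
    return round(mean(monthly_counts[-6:]))

# Example: the figure to publish in a publicly available section of the online interface
print(average_monthly_active_recipients(
    [41_000_000, 42_500_000, 40_800_000, 43_100_000, 44_000_000, 43_600_000]))  # 42500000
```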

3.   Providers of online platforms or of online search engines shall communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms shall ensure that the information submitted does not contain personal data.

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

Article 26

Advertising on online platforms

1.   Providers of online platforms that present advertisements on their online interfaces shall ensure that, for each specific advertisement presented to each individual recipient, the recipients of the service are able to identify, in a clear, concise and unambiguous manner and in real time, the following:

(a)

that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44;

(b)

the natural or legal person on whose behalf the advertisement is presented;

(c)

the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in point (b);

(d)

meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters.

2.   Providers of online platforms shall provide recipients of the service with a functionality to declare whether the content they provide is or contains commercial communications.

When the recipient of the service submits a declaration pursuant to this paragraph, the provider of online platforms shall ensure that other recipients of the service can identify in a clear and unambiguous manner and in real time, including through prominent markings, which might follow standards pursuant to Article 44, that the content provided by the recipient of the service is or contains commercial communications, as described in that declaration.

3.   Providers of online platforms shall not present advertisements to recipients of the service based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679.
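The disclosures in Article 26(1), points (a) to (d), amount to a small per-advertisement metadata record that must be available to the recipient in real time. The sketch below is an illustrative assumption of how such a record could be structured and rendered; the field names and the render_disclosure helper are not a prescribed format, and standards under Article 44 may eventually define one. It does not attempt to model the separate prohibition in paragraph 3.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdDisclosure:
    """Per-advertisement transparency data corresponding to Article 26(1)(a)-(d)."""
    is_advertisement: bool            # (a) to be conveyed through a prominent marking
    on_behalf_of: str                 # (b) natural or legal person on whose behalf the ad is presented
    paid_by: Optional[str]            # (c) payer, disclosed only if different from `on_behalf_of`
    targeting_parameters: list[str]   # (d) main parameters used to determine the recipient
    parameters_change_url: Optional[str] = None  # (d) where applicable, how to change those parameters

def render_disclosure(d: AdDisclosure) -> str:
    """Flatten the record into the clear, concise notice shown alongside the advertisement."""
    lines = ["Sponsored content" if d.is_advertisement else ""]
    lines.append(f"Presented on behalf of: {d.on_behalf_of}")
    if d.paid_by and d.paid_by != d.on_behalf_of:
        lines.append(f"Paid for by: {d.paid_by}")
    lines.append("Main targeting parameters: " + ", ".join(d.targeting_parameters))
    if d.parameters_change_url:
        lines.append(f"Change these parameters: {d.parameters_change_url}")
    return "\n".join(line for line in lines if line)

print(render_disclosure(AdDisclosure(True, "Example NGO", None, ["language", "approximate location"])))
```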

Article 31

Compliance by design

1.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that their online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law.

In particular, the provider concerned shall ensure that its online interface enables traders to provide information on the name, address, telephone number and email address of the economic operator, as defined in Article 3, point (13), of Regulation (EU) 2019/1020 and other Union law.

2.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that their online interface is designed and organised in a way that allows traders to provide at least the following:

(a)

the information necessary for the clear and unambiguous identification of the products or the services promoted or offered to consumers located in the Union through the services of the providers;

(b)

any sign identifying the trader such as the trademark, symbol or logo; and

(c)

where applicable, the information concerning the labelling and marking in compliance with rules of applicable Union law on product safety and product compliance.

3.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall make best efforts to assess whether such traders have provided the information referred to in paragraphs 1 and 2 prior to allowing them to offer their products or services on those platforms. After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider shall make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal.
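Read together, paragraphs 1 and 2 of Article 31 define the minimum set of fields a marketplace's seller-facing interface has to let traders supply, and paragraph 3 adds a best-efforts completeness check at onboarding plus random post-admission checks against official databases. The sketch below is illustrative only: the field names follow the Article, while lookup_official_database is a hypothetical stand-in for whichever official, freely accessible, machine-readable source is actually consulted.

```python
import random
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TraderListing:
    """Information the online interface must allow a trader to provide (Article 31(1) and (2))."""
    economic_operator_name: str           # Article 31(1): economic operator details
    address: str
    telephone: str
    email: str
    product_identification: str           # Article 31(2)(a): clear, unambiguous identification
    trader_sign: str                      # Article 31(2)(b): trademark, symbol or logo
    labelling_info: Optional[str] = None  # Article 31(2)(c): labelling and marking, where applicable

REQUIRED_FIELDS = ("economic_operator_name", "address", "telephone", "email",
                   "product_identification", "trader_sign")

def missing_fields(listing: TraderListing) -> list[str]:
    """Best-efforts check before the trader may offer products (Article 31(3), first sentence)."""
    return [name for name in REQUIRED_FIELDS if not getattr(listing, name).strip()]

def random_illegality_check(product_ids: list[str], sample_size: int,
                            lookup_official_database: Callable[[str], bool]) -> list[str]:
    """Randomly check admitted products against an official database (Article 31(3), second sentence)."""
    sample = random.sample(product_ids, min(sample_size, len(product_ids)))
    return [pid for pid in sample if lookup_official_database(pid)]
```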

Article 32

Right to information

1.   Where a provider of an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider shall inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following:

(a)

the fact that the product or service is illegal;

(b)

the identity of the trader; and

(c)

any relevant means of redress.

The obligation laid down in the first subparagraph shall be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality.

2.   Where, in the situation referred to in paragraph 1, the provider of the online platform allowing consumers to conclude distance contracts with traders does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress.
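In practice Article 32 is a look-back query: identify every consumer who bought the illegal product or service in the six months before the provider became aware, notify those whose contact details are held with the information in points (a) to (c), and fall back to a public notice for the rest. A minimal sketch under those assumptions follows; the Purchase record is hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Purchase:
    consumer_contact: Optional[str]  # None where the provider holds no contact details
    product_id: str
    purchased_at: datetime

def consumers_to_notify(purchases: list[Purchase], illegal_product_id: str,
                        awareness_date: datetime) -> tuple[list[str], bool]:
    """Apply the Article 32(1) six-month look-back window.

    Returns the contact details to notify directly (with the illegality, the trader's
    identity and the means of redress) and a flag indicating whether a public notice
    under Article 32(2) is also needed because some affected consumers are unreachable.
    """
    window_start = awareness_date - timedelta(days=182)  # roughly six months; calendar months in practice
    affected = [p for p in purchases
                if p.product_id == illegal_product_id
                and window_start <= p.purchased_at <= awareness_date]
    reachable = [p.consumer_contact for p in affected if p.consumer_contact]
    return reachable, len(reachable) < len(affected)
```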

SECTION 5

Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities, in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy-to-use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online platform or of very large online search engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online platform or of very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
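The conditions in point (a), subpoints (i) to (iii), reduce to date arithmetic and a fee-structure check that a provider can run when engaging an auditor. The following is a hedged sketch under assumed record-keeping; whether a legal person is "connected" to the provider, and the expertise and ethics tests in points (b) and (c), remain legal assessments that code cannot settle.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class AuditorHistory:
    """Engagement history between one audit organisation and the provider (and connected persons)."""
    non_audit_service_dates: list[date] = field(default_factory=list)  # point (a)(i)
    consecutive_dsa_audit_years: int = 0                               # point (a)(ii)
    fees_contingent_on_result: bool = False                            # point (a)(iii)

def independence_issues(history: AuditorHistory, audit_start: date) -> list[str]:
    """Return the Article 37(3)(a) conditions that the candidate auditor would breach.

    Point (a)(i) also requires a commitment covering the 12 months after the audit
    is completed, which is a contractual undertaking rather than a check modelled here.
    """
    issues = []
    cooling_off_start = audit_start - timedelta(days=365)  # the 12 months before the audit begins
    if any(cooling_off_start <= d < audit_start for d in history.non_audit_service_dates):
        issues.append("(a)(i): non-audit services within the 12 months before the audit")
    if history.consecutive_dsa_audit_years >= 10:
        issues.append("(a)(ii): engaging them again would exceed 10 consecutive years of Article 37 audits")
    if history.fees_contingent_on_result:
        issues.append("(a)(iii): fees contingent on the result of the audit")
    return issues

# Example: a clean history yields no issues
print(independence_issues(AuditorHistory(), date(2025, 1, 1)))  # []
```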

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
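The mandatory content in points (a) to (h) maps naturally onto a structured record, which is also the kind of thing the reporting templates foreseen in paragraph 7 could standardise. The field names below are an illustrative assumption, not an official template.

```python
from dataclasses import dataclass, field
from enum import Enum

class AuditOpinion(Enum):
    """The three opinions allowed by Article 37(4)(g)."""
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"

@dataclass
class AuditReport:
    """Minimum content of an audit report under Article 37(4)."""
    provider_name: str                       # (a) name, address and point of contact of the provider ...
    provider_address: str
    provider_contact: str
    period_covered: str                      # (a) ... and the period covered
    auditing_organisations: list[str]        # (b)
    declaration_of_interests: str            # (c)
    elements_audited: str                    # (d) specific elements audited ...
    methodology: str                         # (d) ... and the methodology applied
    main_findings: str                       # (e)
    third_parties_consulted: list[str]       # (f)
    opinion: AuditOpinion                    # (g)
    operational_recommendations: list[str] = field(default_factory=list)  # (h) unless the opinion is 'positive'
    recommended_timeframe: str = ""                                       # (h)

    def is_complete(self) -> bool:
        """Point (h) only applies when the opinion is not 'positive'."""
        if self.opinion is AuditOpinion.POSITIVE:
            return True
        return bool(self.operational_recommendations and self.recommended_timeframe)
```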

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.

Article 46

Codes of conduct for online advertising

1.   The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level by providers of online platforms and other relevant service providers, such as providers of online advertising intermediary services, other actors involved in the programmatic advertising value chain, or organisations representing recipients of the service and civil society organisations or relevant authorities to contribute to further transparency for actors in the online advertising value chain beyond the requirements of Articles 26 and 39.

2.   The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information that fully respects the rights and interests of all parties involved, as well as a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct at least address the following:

(a)

the transmission of information held by providers of online advertising intermediaries to recipients of the service concerning the requirements set in Article 26(1), points (b), (c) and (d);

(b)

the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 39;

(c)

meaningful information on data monetisation.

3.   The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.

4.   The Commission shall encourage all the actors in the online advertising value chain referred to in paragraph 1 to endorse the commitments stated in the codes of conduct, and to comply with them.

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online platforms, of very large online search engines and, where appropriate, the providers of other online platforms or of other online search engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a)

prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b)

ensuring that the provider of intermediary services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online platforms or of very large online search engines, the compliance officer referred to in Article 41;

(c)

where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a)

the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b)

the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c)

a clear procedure for determining when the crisis protocol is to be activated;

(d)

a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e)

safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f)

a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.
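Paragraph 4 is effectively a checklist for the protocol document itself. Purely as an illustration (none of these names comes from the Regulation), the elements could be captured in a structured record so that a completeness review, and the revision trigger in paragraph 5, have something concrete to inspect.

```python
from dataclasses import dataclass

@dataclass
class CrisisProtocol:
    """Elements an Article 48(4) crisis protocol must set out clearly."""
    activation_parameters: str          # (a) what counts as the extraordinary circumstance, and the objectives
    participant_roles: str              # (b) role of each participant and the measures to prepare and apply
    activation_procedure: str           # (c) how activation is decided
    duration_procedure: str             # (d) how long measures run, strictly limited to what is necessary
    fundamental_rights_safeguards: str  # (e) notably freedom of expression and information, non-discrimination
    public_reporting_process: str       # (f) reporting on measures, duration and outcomes after the crisis

    def missing_elements(self) -> list[str]:
        """Names of required elements that are still empty; a minimal completeness check."""
        return [name for name, value in vars(self).items() if not str(value).strip()]

# Example: an empty draft is missing all six elements
print(CrisisProtocol("", "", "", "", "", "").missing_elements())
```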

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

Article 73

Non-compliance

1.   The Commission shall adopt a non-compliance decision where it finds that the provider of the very large online platform or of the very large online search engine concerned does not comply with one or more of the following:

(a)

the relevant provisions of this Regulation;

(b)

interim measures ordered pursuant to Article 70;

(c)

commitments made binding pursuant to Article 71.

2.   Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the provider of the very large online platform or of the very large online search engine concerned. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the provider of the very large online platform or of the very large online search engine concerned should take, in order to effectively address the preliminary findings.

3.   In the decision adopted pursuant to paragraph 1 the Commission shall order the provider of the very large online platform or of the very large online search engine concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable period specified therein and to provide information on the measures that that provider intends to take to comply with the decision.

4.   The provider of the very large online platform or of the very large online search engine concerned shall provide the Commission with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation.

5.   Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. The decision shall apply with immediate effect.

Article 77

Limitation period for the imposition of penalties

1.   The powers conferred on the Commission by Articles 74 and 76 shall be subject to a limitation period of five years.

2.   Time shall begin to run on the day on which the infringement is committed. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases.

3.   Any action taken by the Commission or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following:

(a)

requests for information by the Commission or by a Digital Services Coordinator;

(b)

inspection;

(c)

the opening of a proceeding by the Commission pursuant to Article 66(1).

4.   Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period has been suspended pursuant to paragraph 5.

5.   The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Commission is the subject of proceedings pending before the Court of Justice of the European Union.
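The rules in Article 77 combine a five-year clock that starts when the infringement is committed (or, for continuing or repeated infringements, when it ceases), interruptions that restart the clock, an absolute cap of twice the limitation period, and suspension while the Commission's decision is before the Court of Justice. A worked sketch of that arithmetic, with the simplifying assumptions that a year is 365 days and that suspension extends both the running clock and the cap:

```python
from datetime import date, timedelta

ONE_YEAR = timedelta(days=365)   # simplification; the real computation follows calendar dates
LIMITATION = 5 * ONE_YEAR        # Article 77(1)

def latest_imposition_date(clock_start: date,
                           interruptions: list[date],
                           suspension_days: int = 0) -> date:
    """Latest date on which a fine or periodic penalty payment may still be imposed.

    clock_start    : day the infringement was committed or, for continuing or
                     repeated infringements, the day it ceased (Article 77(2))
    interruptions  : dates of requests for information, inspections or the opening
                     of proceedings (Article 77(3)); each restarts the clock (Article 77(4))
    suspension_days: time during which the period was suspended under Article 77(5)
    """
    extension = timedelta(days=suspension_days)
    last_restart = max([clock_start, *interruptions])
    ordinary_expiry = last_restart + LIMITATION + extension
    absolute_cap = clock_start + 2 * LIMITATION + extension   # Article 77(4), second sentence
    return min(ordinary_expiry, absolute_cap)

# Example: continuing infringement ceased on 1 March 2024, two interruptions, no suspension
print(latest_imposition_date(date(2024, 3, 1),
                             [date(2026, 5, 10), date(2028, 1, 15)]))  # 2033-01-13
```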

