Digital Services Act (Regulation (EU) 2022/2065)

Article 3

Definitions

For the purpose of this Regulation, the following definitions shall apply:

(a)

‘information_society_service’ means a ‘service’ as defined in Article 1(1), point (b), of Directive (EU) 2015/1535;

(b)

‘recipient_of_the_service’ means any natural or legal person who uses an intermediary_service, in particular for the purposes of seeking information or making it accessible;

(c)

‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;

(d)

‘to_offer_services_in_the_Union’ means enabling natural or legal persons in one or more Member States to use the services of a provider of intermediary_services that has a substantial_connection_to_the_Union;

(e)

‘substantial_connection_to_the_Union’ means a connection of a provider of intermediary_services with the Union resulting either from its establishment in the Union or from specific factual criteria, such as:

a significant number of recipients of the service in one or more Member States in relation to its or their population; or

the targeting of activities towards one or more Member States;

(f)

‘trader’ means any natural person, or any legal person irrespective of whether it is privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

(g)

‘intermediary_service’ means one of the following information_society_services:

(i)

a ‘mere_conduit’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, or the provision of access to a communication network;

(ii)

a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient_of_the_service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

(iii)

a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient_of_the_service;

(h)

‘illegal_content’ means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

(i)

‘online_platform’ means a hosting service that, at the request of a recipient_of_the_service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

(j)

‘online_search_engine’ means an intermediary_service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

(k)

‘dissemination_to_the_public’ means making information available, at the request of the recipient_of_the_service who provided the information, to a potentially unlimited number of third parties;

(l)

‘distance_contract’ means ‘distance_contract’ as defined in Article 2, point (7), of Directive 2011/83/EU;

(m)

‘online_interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

(n)

‘Digital_Services_Coordinator_of_establishment’ means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary_service is located or its legal representative resides or is established;

(o)

‘Digital_Services_Coordinator_of_destination’ means the Digital Services Coordinator of a Member State where the intermediary_service is provided;

(p)

‘active recipient of an online_platform’ means a recipient_of_the_service that has engaged with an online_platform by either requesting the online_platform to host information or being exposed to information hosted by the online_platform and disseminated through its online_interface;

(q)

‘active recipient of an online_search_engine’ means a recipient_of_the_service that has submitted a query to an online_search_engine and been exposed to information indexed and presented on its online_interface;

(r)

‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and presented by an online_platform on its online_interface against remuneration specifically for promoting that information;

(s)

‘recommender_system’ means a fully or partially automated system used by an online_platform to suggest in its online_interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient_of_the_service or otherwise determining the relative order or prominence of information displayed;

(t)

‘content_moderation’ means the activities, whether automated or not, undertaken by providers of intermediary_services, that are aimed, in particular, at detecting, identifying and addressing illegal_content or information incompatible with their terms_and_conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal_content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account;

(u)

‘terms_and_conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary_services and the recipients of the service;

(v)

‘persons_with_disabilities’ means ‘persons_with_disabilities’ as referred to in Article 3, point (1), of Directive (EU) 2019/882 of the European Parliament and of the Council (38);

(w)

‘commercial_communication’ means ‘commercial_communication’ as defined in Article 2, point (f), of Directive 2000/31/EC;

(x)

‘turnover’ means the amount derived by an undertaking within the meaning of Article 5(1) of Council Regulation (EC) No 139/2004 (39).

CHAPTER II

LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

Article 5

Caching

1.   Where an information_society_service is provided that consists of the transmission in a communication network of information provided by a recipient_of_the_service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or more secure the information's onward transmission to other recipients of the service upon their request, on condition that the provider:

(a)

does not modify the information;

(b)

complies with conditions on access to the information;

(c)

complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;

(d)

does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and

(e)

acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a judicial or an administrative authority has ordered such removal or disablement.

2.   This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State’s legal system, to require the service provider to terminate or prevent an infringement.

Article 14

Terms and conditions

1.   Providers of intermediary_services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms_and_conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content_moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary_services shall inform the recipients of the service of any significant change to the terms_and_conditions.

3.   Where an intermediary_service is primarily directed at minors or is predominantly used by them, the provider of that intermediary_service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary_services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online_platforms and of very large online_search_engines shall provide recipients of services with a concise, easily-accessible and machine-readable summary of the terms_and_conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.
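Article 14(1) and (5) require the relevant information on terms and conditions to be available in a machine-readable format, but prescribe no schema. A minimal sketch of what such a summary could look like, assuming JSON as the serialisation; every field name here is invented for illustration:

```python
import json

# A hypothetical machine-readable terms-and-conditions summary.
# Field names are illustrative assumptions, not prescribed by the Regulation.
summary = {
    "provider": "Example Platform",            # hypothetical provider name
    "last_updated": "2024-01-15",
    "content_moderation": {
        "algorithmic_decision_making": True,   # Art. 14(1): must be disclosed
        "human_review": True,
        "tools": ["hash matching", "keyword filters", "user reports"],
    },
    "restrictions": [
        "No illegal_content as defined by Union or Member State law",
        "No spam or coordinated platform manipulation",
    ],
    "remedies": {                              # Art. 14(5): remedies and redress
        "internal_complaint_handling": True,
        "out_of_court_dispute_settlement": True,
        "judicial_redress": True,
    },
}

print(json.dumps(summary, indent=2))
```

Serialising to JSON keeps the summary both human-readable and trivially machine-readable, which is one plausible way of satisfying both paragraph 1 and paragraph 5 at once.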

6.   Very large online_platforms and very large online_search_engines within the meaning of Article 33 shall publish their terms_and_conditions in the official languages of all the Member States in which they offer their services.

Article 16

Notice and action mechanisms

1.   Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal_content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.

2.   The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:

(a)

a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal_content;

(b)

a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal_content adapted to the type of content and to the specific type of hosting service;

(c)

the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;

(d)

a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
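The four elements above can be modelled as a simple record. A hedged sketch in Python, where the class name, field names and the completeness check are all illustrative assumptions (Article 16 prescribes the substance of a notice, not any particular format):

```python
from dataclasses import dataclass
from typing import List, Optional
from urllib.parse import urlparse

@dataclass
class Notice:
    explanation: str               # (a) why the content is alleged to be illegal
    urls: List[str]                # (b) exact electronic location(s)
    name: Optional[str]            # (c) may be withheld for offences under
    email: Optional[str]           #     Articles 3 to 7 of Directive 2011/93/EU
    bona_fide_statement: bool      # (d) accuracy and completeness confirmation

    def is_sufficiently_precise(self) -> bool:
        """Rough pre-submission completeness check (illustrative, not normative)."""
        has_location = bool(self.urls) and all(
            urlparse(u).scheme in ("http", "https") for u in self.urls
        )
        return bool(self.explanation.strip()) and has_location and self.bona_fide_statement

notice = Notice(
    explanation="Listing offers a product that is prohibited under Union law.",
    urls=["https://example.com/item/123"],
    name="Jane Doe",
    email="jane@example.org",
    bona_fide_statement=True,
)
print(notice.is_sufficiently_precise())  # True
```

A check like this could back the "easy to access and user-friendly" electronic submission mechanism of paragraph 1, flagging incomplete notices before they are sent; whether a notice actually gives rise to actual knowledge under paragraph 3 remains a legal question.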

3.   Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

4.   Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.

5.   The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.

6.   Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.

Article 25

Online interface design and organisation

1.   Providers of online_platforms shall not design, organise or operate their online_interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.

2.   The prohibition in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.

3.   The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably:

(a)

giving more prominence to certain choices when asking the recipient_of_the_service for a decision;

(b)

repeatedly requesting that the recipient_of_the_service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience;

(c)

making the procedure for terminating a service more difficult than subscribing to it.

Article 35

Mitigation of risks

1.   Providers of very large online_platforms and of very large online_search_engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online_interfaces;

(b)

adapting their terms_and_conditions and their enforcement;

(c)

adapting content_moderation processes, including the speed and quality of processing notices related to specific types of illegal_content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content_moderation;

(d)

testing and adapting their algorithmic systems, including their recommender_systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online_platforms or of online_search_engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online_interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online_interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online_platforms and of very large online_search_engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online_platforms and of very large online_search_engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 40

Data access and scrutiny

1.   Providers of very large online_platforms or of very large online_search_engines shall provide the Digital_Services_Coordinator_of_establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation.

2.   Digital Services Coordinators and the Commission shall use the data accessed pursuant to paragraph 1 only for the purpose of monitoring and assessing compliance with this Regulation and shall take due account of the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of personal data, the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

3.   For the purposes of paragraph 1, providers of very large online_platforms or of very large online_search_engines shall, at the request of either the Digital_Services_Coordinator_of_establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender_systems.

4.   Upon a reasoned request from the Digital_Services_Coordinator_of_establishment, providers of very large online_platforms or of very large online_search_engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.

5.   Within 15 days following receipt of a request as referred to in paragraph 4, providers of very large online_platforms or of very large online_search_engines may request the Digital_Services_Coordinator_of_establishment to amend the request, where they consider that they are unable to give access to the data requested for one of the following two reasons:

(a)

they do not have access to the data;

(b)

giving access to the data will lead to significant vulnerabilities in the security of their service or the protection of confidential information, in particular trade secrets.

6.   Requests for amendment pursuant to paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.

The Digital_Services_Coordinator_of_establishment shall decide on the request for amendment within 15 days and communicate to the provider of the very large online_platform or of the very large online_search_engine its decision and, where relevant, the amended request and the new period to comply with the request.
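The two 15-day windows in paragraphs 5 and 6 can be made concrete with a small deadline calculation. This sketch counts plain calendar days, an assumption on my part: the Regulation says "15 days" here without specifying working or calendar days:

```python
from datetime import date, timedelta
from enum import Enum

class AmendmentGround(Enum):
    """The two grounds a provider may invoke under Article 40(5)."""
    NO_ACCESS = "provider does not have access to the data"          # point (a)
    SECURITY = "access would create significant vulnerabilities"     # point (b)

def amendment_deadline(request_received: date) -> date:
    """Latest day for the provider to request an amendment (Art. 40(5))."""
    return request_received + timedelta(days=15)

def decision_deadline(amendment_received: date) -> date:
    """Latest day for the DSC of establishment to decide on it (Art. 40(6))."""
    return amendment_received + timedelta(days=15)

print(amendment_deadline(date(2024, 3, 1)))    # 2024-03-16
print(decision_deadline(date(2024, 3, 10)))    # 2024-03-25
```

Note that a request for amendment must also propose alternative means of access (paragraph 6); the timeline above only covers when the procedural steps fall due.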

7.   Providers of very large online_platforms or of very large online_search_engines shall facilitate and provide access to data pursuant to paragraphs 1 and 4 through appropriate interfaces specified in the request, including online databases or application programming interfaces.
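Paragraph 7 leaves the concrete interface open ("online databases or application programming interfaces"). As a purely hypothetical illustration of what such an API could look like, with the base URL, endpoint paths and parameters all invented (the Regulation requires only "appropriate interfaces", not any specific design):

```python
from typing import Optional

# Entirely hypothetical API shape for Article 40(7)-style data access.
API_SPEC = {
    "base_url": "https://data-access.example-platform.eu/v1",  # invented host
    "auth": "credentials issued to vetted researchers (Article 40(8))",
    "endpoints": {
        "/datasets": "list datasets made available under a reasoned request",
        "/datasets/{id}/records": "page through records; optional 'since' "
                                  "filter for real-time data (Article 40(12))",
    },
}

def build_records_url(dataset_id: str, since: Optional[str] = None) -> str:
    """Assemble a request URL against the hypothetical spec above."""
    url = f"{API_SPEC['base_url']}/datasets/{dataset_id}/records"
    return url + (f"?since={since}" if since else "")

print(build_records_url("ads-2024", since="2024-06-01T00:00:00Z"))
```

The point of the sketch is the separation of concerns the Article implies: the reasoned request specifies the interface, and the provider exposes exactly the data covered by it through that interface.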

8.   Upon a duly substantiated application from researchers, the Digital_Services_Coordinator_of_establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of a very large online_platform or of a very large online_search_engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:

(a)

they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;

(b)

they are independent from commercial interests;

(c)

their application discloses the funding of the research;

(d)

they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;

(e)

their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in paragraph 4;

(f)

the planned research activities will be carried out for the purposes laid down in paragraph 4;

(g)

they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.

Upon receipt of the application pursuant to this paragraph, the Digital_Services_Coordinator_of_establishment shall inform the Commission and the Board.

9.   Researchers may also submit their application to the Digital Services Coordinator of the Member State of the research organisation to which they are affiliated. Upon receipt of the application pursuant to this paragraph the Digital Services Coordinator shall conduct an initial assessment as to whether the respective researchers meet all of the conditions set out in paragraph 8. The respective Digital Services Coordinator shall subsequently send the application, together with the supporting documents submitted by the respective researchers and the initial assessment, to the Digital_Services_Coordinator_of_establishment. The Digital_Services_Coordinator_of_establishment shall take a decision whether to award a researcher the status of ‘vetted researcher’ without undue delay.

While taking due account of the initial assessment provided, the final decision to award a researcher the status of ‘vetted researcher’ lies within the competence of the Digital_Services_Coordinator_of_establishment, pursuant to paragraph 8.

10.   The Digital Services Coordinator that awarded the status of vetted researcher and issued the reasoned request for data access to the providers of very large online_platforms or of very large online_search_engines in favour of a vetted researcher shall issue a decision terminating the access if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, that the vetted researcher no longer meets the conditions set out in paragraph 8, and shall inform the provider of the very large online_platform or of the very large online_search_engine concerned of the decision. Before terminating the access, the Digital Services Coordinator shall allow the vetted researcher to react to the findings of its investigation and to its intention to terminate the access.

11.   Digital Services Coordinators of establishment shall communicate to the Board the names and contact information of the natural persons or entities to which they have awarded the status of ‘vetted researcher’ in accordance with paragraph 8, as well as the purpose of the research in respect of which the application was made or, where they have terminated the access to the data in accordance with paragraph 10, communicate that information to the Board.

12.   Providers of very large online_platforms or of very large online_search_engines shall give access without undue delay to data, including, where technically possible, to real-time data, provided that the data is publicly accessible in their online_interface by researchers, including those affiliated to not for profit bodies, organisations and associations, who comply with the conditions set out in paragraph 8, points (b), (c), (d) and (e), and who use the data solely for performing research that contributes to the detection, identification and understanding of systemic risks in the Union pursuant to Article 34(1).

13.   The Commission shall, after consulting the Board, adopt delegated acts supplementing this Regulation by laying down the technical conditions under which providers of very large online_platforms or of very large online_search_engines are to share data pursuant to paragraphs 1 and 4 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with researchers can take place in compliance with Regulation (EU) 2016/679, as well as relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data, taking into account the rights and interests of the providers of very large online_platforms or of very large online_search_engines and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

Article 47

Codes of conduct for accessibility

1.   The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level with the involvement of providers of online_platforms and other relevant service providers, organisations representing recipients of the service and civil society organisations or relevant authorities to promote full and effective, equal participation, by improving access to online services that, through their initial design or subsequent adaptation, address the particular needs of persons_with_disabilities.

2.   The Commission shall aim to ensure that the codes of conduct pursue the objective of ensuring that those services are accessible in compliance with Union and national law, in order to maximise their foreseeable use by persons_with_disabilities. The Commission shall aim to ensure that the codes of conduct address at least the following objectives:

(a)

designing and adapting services to make them accessible to persons_with_disabilities by making them perceivable, operable, understandable and robust;

(b)

explaining how the services meet the applicable accessibility requirements and making this information available to the public in an accessible manner for persons_with_disabilities;

(c)

making information, forms and measures provided pursuant to this Regulation available in such a manner that they are easy to find, easy to understand, and accessible to persons_with_disabilities.

3.   The Commission shall encourage the development of the codes of conduct by 18 February 2025 and their application by 18 August 2025.

Article 64

Development of expertise and capabilities

1.   The Commission, in cooperation with the Digital Services Coordinators and the Board, shall develop Union expertise and capabilities, including, where appropriate, through the secondment of Member States’ personnel.

2.   In addition, the Commission, in cooperation with the Digital Services Coordinators and the Board, shall coordinate the assessment of systemic and emerging issues across the Union in relation to very large online_platforms or very large online_search_engines with regard to matters covered by this Regulation.

3.   The Commission may ask the Digital Services Coordinators, the Board and other Union bodies, offices and agencies with relevant expertise to support the assessment of systemic and emerging issues across the Union under this Regulation.

4.   Member States shall cooperate with the Commission, in particular through their respective Digital Services Coordinators and other competent authorities, where applicable, including by making available their expertise and capabilities.

Article 82

Requests for access restrictions and cooperation with national courts

1.   Where all powers pursuant to this Section to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission may request the Digital_Services_Coordinator_of_establishment of the provider of the very large online_platform or of the very large online_search_engine concerned to act pursuant to Article 51(3).

Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a period that shall not be less than 14 working days, describing the measures it intends to request and identifying the intended addressee or addressees thereof.

2.   Where the coherent application of this Regulation so requires, the Commission, acting on its own initiative, may submit written observations to the competent judicial authority referred to in Article 51(3). With the permission of the judicial authority in question, it may also make oral observations.

For the purpose of the preparation of its observations only, the Commission may request that judicial authority to transmit or ensure the transmission to it of any documents necessary for the assessment of the case.

3.   When a national court rules on a matter which is already the subject matter of a decision adopted by the Commission under this Regulation, that national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings it has initiated under this Regulation. To that effect, a national court may assess whether it is necessary to stay its proceedings. This is without prejudice to Article 267 TFEU.

Article 87

Exercise of the delegation

1.   The power to adopt delegated acts is conferred on the Commission subject to the conditions laid down in this Article.

2.   The delegation of power referred to in Articles 24, 33, 37, 40 and 43 shall be conferred on the Commission for five years starting from 16 November 2022. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period.

3.   The delegation of power referred to in Articles 24, 33, 37, 40 and 43 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.

4.   Before adopting a delegated act, the Commission shall consult experts designated by each Member State in accordance with the principles laid down in the Interinstitutional Agreement of 13 April 2016 on Better Law-making.

5.   As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament and to the Council.

6.   A delegated act adopted pursuant to Articles 24, 33, 37, 40 and 43 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.

