Regulation (EU) 2022/2065 (Digital Services Act): selected provisions

BG CS DA DE EL EN ES ET FI FR GA HR HU IT LV LT MT NL PL PT RO SK SL SV print pdf

2022/2065 EN cercato: 'enable' . Output generated live by software developed by IusOnDemand srl


expand index enable:


whereas enable:


definitions:


cloud tag: and the number of total unique words without stopwords is: 796

 

Article 6

Hosting

1.   Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service, on condition that the provider:

(a)

does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or

(b)

upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.

2.   Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.

3.   Paragraph 1 shall not apply with respect to the liability under consumer protection law of online platforms that allow consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.

4.   This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State's legal system, to require the service provider to terminate or prevent an infringement.

Article 11

Points of contact for Member States’ authorities, the Commission and the Board

1.   Providers of intermediary services shall designate a single point of contact to enable them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 61 for the application of this Regulation.

2.   Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. That information shall be easily accessible, and shall be kept up to date.

3.   Providers of intermediary services shall specify in the information referred to in paragraph 2 the official language or languages of the Member States which, in addition to a language broadly understood by the largest possible number of Union citizens, can be used to communicate with their points of contact, and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established.
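
Purely as an illustration, Article 11(2) and (3) translate naturally into a small machine-readable record that a provider might publish. The sketch below is a minimal, non-authoritative example; every field name and the example values are assumptions, since the Regulation prescribes the information to be published, not any format.

```typescript
// Hypothetical shape of a published point of contact for authorities under
// Article 11(2) and (3). Field names and structure are illustrative only.
interface AuthorityContactPoint {
  email: string;               // a direct electronic means of communication
  webForm?: string;            // an optional additional electronic channel
  // Languages accepted for communication: a language broadly understood by
  // the largest possible number of Union citizens, plus at least one official
  // language of the Member State of main establishment or of the legal
  // representative (Article 11(3)).
  acceptedLanguages: string[]; // ISO 639-1 codes, e.g. ["en", "de"]
  lastUpdated: string;         // ISO 8601 date; the information must be kept
                               // up to date (Article 11(2))
}

// Example record with placeholder values.
const contactPoint: AuthorityContactPoint = {
  email: "dsa-authorities@provider.example",
  acceptedLanguages: ["en", "de"],
  lastUpdated: "2024-02-17",
};
```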

Article 12

Points of contact for recipients of the service

1.   Providers of intermediary services shall designate a single point of contact to enable recipients of the service to communicate directly and rapidly with them, by electronic means and in a user-friendly manner, including by allowing recipients of the service to choose the means of communication, which shall not solely rely on automated tools.

2.   In addition to the obligations provided under Directive 2000/31/EC, providers of intermediary services shall make public the information necessary for the recipients of the service in order to easily identify and communicate with their single points of contact. That information shall be easily accessible, and shall be kept up to date.

Article 16

Notice and action mechanisms

1.   Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.

2.   The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:

(a)

a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal content;

(b)

a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content adapted to the type of content and to the specific type of hosting service;

(c)

the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;

(d)

a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
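
Read as a data requirement, elements (a) to (d) above describe the payload of a notice submission. The following is a minimal sketch of such a payload under a hypothetical schema; the type and field names are invented for illustration and are not prescribed by the Regulation.

```typescript
// Illustrative shape of a notice under Article 16(2). Only the listed
// elements come from the Regulation; the schema itself is hypothetical.
interface IllegalContentNotice {
  explanation: string;  // (a) substantiated reasons why the information is
                        //     alleged to be illegal content
  locations: string[];  // (b) exact URLs or equivalent identifiers of the
                        //     information concerned
  // (c) name and email address of the notifier; omitted for notices
  //     concerning offences under Articles 3 to 7 of Directive 2011/93/EU
  notifier?: { name: string; email: string };
  bonaFideStatement: boolean; // (d) confirmation of the notifier's good-faith
                              //     belief that the notice is accurate and complete
  submittedAt: string;        // ISO 8601 timestamp; useful for the confirmation
                              // of receipt required by Article 16(4)
}
```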

3.   Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

4.   Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.

5.   The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.

6.   Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.

Article 20

Internal complaint-handling system

1.   Providers of online platforms shall provide recipients of the service, including individuals or entities that have submitted a notice, for a period of at least six months following the decision referred to in this paragraph, with access to an effective internal complaint-handling system that enables them to lodge complaints, electronically and free of charge, against the decision taken by the provider of the online platform upon the receipt of a notice or against the following decisions taken by the provider of the online platform on the grounds that the information provided by the recipients constitutes illegal content or is incompatible with its terms and conditions:

(a)

decisions whether or not to remove or disable access to or restrict visibility of the information;

(b)

decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;

(c)

decisions whether or not to suspend or terminate the recipients’ account;

(d)

decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients.
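
For illustration only, points (a) to (d) above form a closed set of decision types that an internal complaint-handling system must cover, and paragraph 2, below, fixes when the six-month complaint window starts. The sketch models both; the value names and the helper function are assumptions, not terms from the Regulation.

```typescript
// Hypothetical enumeration of the decisions contestable under
// Article 20(1), points (a) to (d).
type ContestableDecision =
  | "remove_disable_or_restrict_visibility" // (a)
  | "suspend_or_terminate_service"          // (b)
  | "suspend_or_terminate_account"          // (c)
  | "restrict_monetisation";                // (d)

// Complaints must be accepted for at least six months from the day the
// recipient is informed of the decision (Article 20(1) and (2)).
function complaintWindowOpen(informedOn: Date, now: Date): boolean {
  const deadline = new Date(informedOn);
  deadline.setMonth(deadline.getMonth() + 6);
  return now <= deadline;
}
```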

2.   The period of at least six months referred to in paragraph 1 of this Article shall start on the day on which the recipient of the service is informed about the decision in accordance with Article 16(5) or Article 17.

3.   Providers of online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints.

4.   Providers of online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner. Where a complaint contains sufficient grounds for the provider of the online platform to consider that its decision not to act upon the notice is unfounded or that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the measure taken, it shall reverse its decision referred to in paragraph 1 without undue delay.

5.   Providers of online platforms shall inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates and of the possibility of out-of-court dispute settlement provided for in Article 21 and other available possibilities for redress.

6.   Providers of online platforms shall ensure that the decisions, referred to in paragraph 5, are taken under the supervision of appropriately qualified staff, and not solely on the basis of automated means.

Article 31

Compliance by design

1.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that their online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law.

In particular, the provider concerned shall ensure that its online interface enables traders to provide information on the name, address, telephone number and email address of the economic operator, as defined in Article 3, point (13), of Regulation (EU) 2019/1020 and other Union law.

2.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that their online interface is designed and organised in a way that allows traders to provide at least the following:

(a)

the information necessary for the clear and unambiguous identification of the products or the services promoted or offered to consumers located in the Union through the services of the providers;

(b)

any sign identifying the trader, such as the trademark, symbol or logo; and

(c)

where applicable, the information concerning the labelling and marking in compliance with rules of applicable Union law on product safety and product compliance.
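
Taken together, paragraph 1 and points (a) to (c) above enumerate the trader-supplied data that the online interface must accommodate. As a minimal, non-authoritative sketch, that data could be captured as follows; the field names are hypothetical.

```typescript
// Illustrative record of the trader information required by Article 31(1)
// and (2). The structure and names are assumptions for illustration.
interface TraderListingInformation {
  // Article 31(1): details of the economic operator as defined in
  // Article 3, point (13), of Regulation (EU) 2019/1020.
  economicOperator: {
    name: string;
    address: string;
    telephone: string;
    email: string;
  };
  productIdentification: string; // (a) clear and unambiguous identification
  traderSign?: string;           // (b) trademark, symbol or logo, if any
  labellingInfo?: string;        // (c) labelling and marking, where applicable
}
```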

3.   Providers of online platforms allowing consumers to conclude distance contracts with traders shall make best efforts to assess whether such traders have provided the information referred to in paragraphs 1 and 2 prior to allowing them to offer their products or services on those platforms. After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider shall make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal.
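
The random check described in paragraph 3 might be implemented along the lines of the sketch below. This is a rough illustration under stated assumptions: the database endpoint, its response shape and the sampling strategy are all hypothetical, since the Regulation only requires reasonable, random checks against official, freely accessible and machine-readable sources.

```typescript
// Hypothetical sketch of an Article 31(3) random compliance check.
// "https://public-product-db.example/check" is a placeholder for any
// official, freely accessible and machine-readable database.
async function randomComplianceCheck(
  listingIds: string[],
  sampleSize: number,
): Promise<string[]> {
  // Draw a simple random sample of listings without replacement.
  const pool = [...listingIds];
  const sample: string[] = [];
  while (sample.length < sampleSize && pool.length > 0) {
    const i = Math.floor(Math.random() * pool.length);
    sample.push(pool.splice(i, 1)[0]);
  }

  // Query the (hypothetical) database for each sampled listing and collect
  // the identifiers of products or services flagged as illegal.
  const flagged: string[] = [];
  for (const id of sample) {
    const res = await fetch(
      `https://public-product-db.example/check?id=${encodeURIComponent(id)}`,
    );
    const body = (await res.json()) as { illegal: boolean };
    if (body.illegal) flagged.push(id);
  }
  return flagged;
}
```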

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy-to-use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, as enshrined in the Charter, of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 37

Independent audit

1.   Providers of very large online platforms and of very large online search engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online platforms and of very large online search engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online platforms and of very large online search engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied by versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online platforms or of very large online search engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of the very large online platform or of the very large online search engine concerned and to any legal person connected to that provider in the 12-month period before the beginning of the audit and have committed to not providing them with such services in the 12-month period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of the very large online platform or of the very large online search engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.

4.   Providers of very large online platforms and of very large online search engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online platform or of the very large online search engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online platform or of the very large online search engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.
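
Points (a) to (h) above amount to a fixed report structure with a three-valued opinion. As a minimal sketch, assuming hypothetical field names, an audit report could be represented as follows; only the listed contents and the three opinion values come from the Regulation.

```typescript
// Illustrative structure for an audit report under Article 37(4).
type AuditOpinion = "positive" | "positive with comments" | "negative";

interface AuditReport {
  auditedProvider: {                              // (a)
    name: string;
    address: string;
    contactPoint: string;
  };
  periodCovered: { from: string; to: string };    // (a)
  auditors: { name: string; address: string }[];  // (b)
  declarationOfInterests: string;                 // (c)
  elementsAuditedAndMethodology: string;          // (d)
  mainFindings: string;                           // (e)
  thirdPartiesConsulted: string[];                // (f)
  opinion: AuditOpinion;                          // (g)
  // (h) required whenever the opinion is not "positive"
  recommendations?: { measures: string; timeframe: string };
}
```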

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online platforms or of very large online search engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.

Article 69

Power to conduct inspections

1.   In order to carry out the tasks assigned to it under this Section, the Commission may conduct all necessary inspections at the premises of the provider of the very large online platform or of the very large online search engine concerned or of another person referred to in Article 67(1).

2.   The officials and other accompanying persons authorised by the Commission to conduct an inspection shall be empowered to:

(a)

enter any premises, land and means of transport of the provider of the very large online platform or of the very large online search engine concerned or of the other person concerned;

(b)

examine the books and other records related to the provision of the service concerned, irrespective of the medium on which they are stored;

(c)

take or obtain in any form copies of or extracts from such books or other records;

(d)

require the provider of the very large online platform or of the very large online search engine or the other person concerned to provide access to and explanations on its organisation, functioning, IT system, algorithms, data-handling and business practices and to record or document the explanations given;

(e)

seal any premises used for purposes related to the trade, business, craft or profession of the provider of the very large online platform or of the very large online search engine or of the other person concerned, as well as books or other records, for the period and to the extent necessary for the inspection;

(f)

ask any representative or member of staff of the provider of the very large online platform or of the very large online search engine or the other person concerned for explanations on facts or documents relating to the subject-matter and purpose of the inspection and to record the answers;

(g)

address questions to any such representative or member of staff relating to the subject-matter and purpose of the inspection and to record the answers.

3.   Inspections may be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 72(2), and of the Digital Services Coordinator or other competent national authorities of the Member State in the territory of which the inspection is conducted.

4.   Where the production of required books or other records related to the provision of the service concerned is incomplete or where the answers to questions asked under paragraph 2 of this Article are incorrect, incomplete or misleading, the officials and other accompanying persons authorised by the Commission to conduct an inspection shall exercise their powers upon production of a written authorisation specifying the subject matter and purpose of the inspection and the penalties provided for in Articles 74 and 76. In good time before the inspection, the Commission shall inform the Digital Services Coordinator of the Member State in the territory in which the inspection is to be conducted thereof.

5.   During inspections, the officials and other accompanying persons authorised by the Commission, the auditors and experts appointed by the Commission, the Digital Services Coordinator or the other competent authorities of the Member State in the territory of which the inspection is conducted may require the provider of the very large online platform or of the very large online search engine or other person concerned to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conduct, and may address questions to its key personnel.

6.   The provider of the very large online platform or of the very large online search engine or other natural or legal person concerned shall be required to submit to an inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the inspection, set the date on which it is to begin and indicate the penalties provided for in Articles 74 and 76 and the right to have the decision reviewed by the Court of Justice of the European Union. The Commission shall consult the Digital Services Coordinator of the Member State on the territory of which the inspection is to be conducted prior to taking that decision.

7.   Officials of, and other persons authorised or appointed by, the Digital Services Coordinator of the Member State on the territory of which the inspection is to be conducted shall, at the request of that Digital Services Coordinator or of the Commission, actively assist the officials and other accompanying persons authorised by the Commission in relation to the inspection. To this end, they shall have the powers listed in paragraph 2.

8.   Where the officials and other accompanying persons authorised by the Commission find that the provider of the very large online platform or of the very large online search engine or the other person concerned opposes an inspection ordered pursuant to this Article, the Member State in the territory of which the inspection is to be conducted shall, at the request of those officials or other accompanying persons and in accordance with the national law of the Member State, afford them necessary assistance, including, where appropriate under that national law, in the form of coercive measures taken by a competent law enforcement authority, so as to enable them to conduct the inspection.

9.   If the assistance provided for in paragraph 8 requires authorisation from a national judicial authority in accordance with the national law of the Member State concerned, such authorisation shall be applied for by the Digital Services Coordinator of that Member State at the request of the officials and other accompanying persons authorised by the Commission. Such authorisation may also be applied for as a precautionary measure.

10.   Where the authorisation referred to in paragraph 9 is applied for, the national judicial authority before which a case has been brought shall verify that the Commission decision ordering the inspection is authentic and that the coercive measures envisaged are neither arbitrary nor excessive having regard to the subject matter of the inspection. When conducting such verification, the national judicial authority may ask the Commission, directly or through the Digital Services Coordinator of the Member State concerned, for detailed explanations, in particular those concerning the grounds on which the Commission suspects an infringement of this Regulation, concerning the seriousness of the suspected infringement and concerning the nature of the involvement of the provider of the very large online platform or of the very large online search engine or of the other person concerned. However, the national judicial authority shall not call into question the necessity for the inspection nor demand information from the case file of the Commission. The lawfulness of the Commission decision shall be subject to review only by the Court of Justice of the European Union.

