Digital Services Act (Regulation (EU) 2022/2065)

BG CS DA DE EL EN ES ET FI FR GA HR HU IT LV LT MT NL PL PT RO SK SL SV print pdf

2022/2065 EN cercato: 'notices' . Output generated live by software developed by IusOnDemand srl


expand index notices:


whereas notices:


definitions:


cloud tag: and the number of total unique words without stopwords is: 775

 

Article 15

Transparency reporting obligations for providers of intermediary services

1.   Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a)

for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b)

for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c)

for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d)

for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e)

any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.
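The reporting categories in points (a) to (e) lend themselves to a machine-readable structure, as paragraph 1 itself requires. A minimal Python sketch follows; the field names and record shapes are illustrative assumptions, not anything mandated by the Regulation or its implementing acts.

```python
# Hypothetical sketch: aggregating raw records into a machine-readable
# transparency report covering some of the Article 15(1) categories.
# All field names are illustrative assumptions.
import json
from statistics import median

def build_transparency_report(orders, notices, complaints):
    """Aggregate raw records into Article 15(1)-style reporting categories."""
    return {
        # point (a): orders received from Member States' authorities
        "orders": {
            "total": len(orders),
            "median_days_to_give_effect": (
                median(o["days_to_effect"] for o in orders) if orders else None
            ),
        },
        # point (b): notices submitted under Article 16
        "notices": {
            "total": len(notices),
            "from_trusted_flaggers": sum(1 for n in notices if n["trusted_flagger"]),
            "processed_automatically": sum(1 for n in notices if n["automated"]),
        },
        # point (d): internal complaint-handling outcomes
        "complaints": {
            "total": len(complaints),
            "reversed": sum(1 for c in complaints if c["reversed"]),
        },
    }

report = build_transparency_report(
    orders=[{"days_to_effect": 3}, {"days_to_effect": 7}],
    notices=[{"trusted_flagger": True, "automated": False},
             {"trusted_flagger": False, "automated": True}],
    complaints=[{"reversed": True}],
)
print(json.dumps(report, indent=2))
```

Serialising to JSON is one straightforward way to satisfy the "machine-readable format" requirement; the harmonised templates foreseen in paragraph 3 would supersede any such ad hoc structure.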

2.   Paragraph 1 of this Article shall not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of this Regulation.

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

SECTION 2

Additional provisions applicable to providers of hosting services, including online platforms

Article 16

Notice and action mechanisms

1.   Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.

2.   The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:

(a)

a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal content;

(b)

a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content adapted to the type of content and to the specific type of hosting service;

(c)

the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;

(d)

a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.
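The four elements in points (a) to (d) can be read as a validation checklist for incoming notices. A minimal sketch, assuming hypothetical field names and reflecting the point (c) carve-out for information involving offences under Articles 3 to 7 of Directive 2011/93/EU:

```python
# Hypothetical sketch: checking that a notice carries the Article 16(2)
# elements. Field names are illustrative assumptions, not prescribed terms.

REQUIRED_FIELDS = {
    "explanation",      # (a) why the information is alleged to be illegal
    "location",         # (b) exact URL(s) or equivalent electronic location
    "submitter_name",   # (c) name of the submitting individual or entity
    "submitter_email",  # (c) email address of the submitter
    "bona_fide",        # (d) statement of good-faith belief in accuracy
}

def missing_elements(notice: dict, child_abuse_offence: bool = False) -> set:
    """Return the Article 16(2) elements absent from the notice.

    Identity fields are waived where the information is considered to
    involve offences under Articles 3 to 7 of Directive 2011/93/EU."""
    required = set(REQUIRED_FIELDS)
    if child_abuse_offence:
        required -= {"submitter_name", "submitter_email"}
    return {f for f in required if not notice.get(f)}

notice = {
    "explanation": "alleged copyright infringement",
    "location": "https://example.org/item/1",
    "bona_fide": True,
}
print(missing_elements(notice))  # identity fields are missing here
print(missing_elements(notice, child_abuse_offence=True))
```

A provider's submission form would, under paragraph 2, be designed so that these elements can be supplied rather than merely checked after the fact.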

3.   Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

4.   Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.

5.   The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.

6.   Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.

Article 21

Out-of-court dispute settlement

1.   Recipients of the service, including individuals or entities that have submitted notices, addressed by the decisions referred to in Article 20(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 3 of this Article in order to resolve disputes relating to those decisions, including complaints that have not been resolved by means of the internal complaint-handling system referred to in that Article.

Providers of online platforms shall ensure that information about the possibility for recipients of the service to have access to an out-of-court dispute settlement, as referred to in the first subparagraph, is easily accessible on their online interface, clear and user-friendly.

The first subparagraph is without prejudice to the right of the recipient of the service concerned to initiate, at any stage, proceedings to contest those decisions by the providers of online platforms before a court in accordance with the applicable law.

2.   Both parties shall engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute.

Providers of online platforms may refuse to engage with such out-of-court dispute settlement body if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content.

The certified out-of-court dispute settlement body shall not have the power to impose a binding settlement of the dispute on the parties.

3.   The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, for a maximum period of five years, which may be renewed, certify the body, at its request, where the body has demonstrated that it meets all of the following conditions:

(a)

it is impartial and independent, including financially independent, of providers of online platforms and of recipients of the service provided by providers of online platforms, including of individuals or entities that have submitted notices;

(b)

it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platform, allowing the body to contribute effectively to the settlement of a dispute;

(c)

its members are remunerated in a way that is not linked to the outcome of the procedure;

(d)

the out-of-court dispute settlement that it offers is easily accessible, through electronic communications technology and provides for the possibility to initiate the dispute settlement and to submit the requisite supporting documents online;

(e)

it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one of the official languages of the institutions of the Union;

(f)

the out-of-court dispute settlement that it offers takes place in accordance with clear and fair rules of procedure that are easily and publicly accessible, and that comply with applicable law, including this Article.

The Digital Services Coordinator shall, where applicable, specify in the certificate:

(a)

the particular issues to which the body’s expertise relates, as referred to in point (b) of the first subparagraph; and

(b)

the official language or languages of the institutions of the Union in which the body is capable of settling disputes, as referred to in point (e) of the first subparagraph.

4.   Certified out-of-court dispute settlement bodies shall report to the Digital Services Coordinator that certified them, on an annual basis, on their functioning, specifying at least the number of disputes they received, the information about the outcomes of those disputes, the average time taken to resolve them and any shortcomings or difficulties encountered. They shall provide additional information at the request of that Digital Services Coordinator.

Digital Services Coordinators shall, every two years, draw up a report on the functioning of the out-of-court dispute settlement bodies that they certified. That report shall in particular:

(a)

list the number of disputes that each certified out-of-court dispute settlement body has received annually;

(b)

indicate the outcomes of the procedures brought before those bodies and the average time taken to resolve the disputes;

(c)

identify and explain any systematic or sectoral shortcomings or difficulties encountered in relation to the functioning of those bodies;

(d)

identify best practices concerning that functioning;

(e)

make recommendations as to how to improve that functioning, where appropriate.

Certified out-of-court dispute settlement bodies shall make their decisions available to the parties within a reasonable period of time and no later than 90 calendar days after the receipt of the complaint. In the case of highly complex disputes, the certified out-of-court dispute settlement body may, at its own discretion, extend the 90 calendar day period for an additional period that shall not exceed 90 days, resulting in a maximum total duration of 180 days.
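The deadline arithmetic here — 90 calendar days from receipt, extendable once by at most 90 days for highly complex disputes — can be sketched as a date calculation. The function name and example dates are illustrative:

```python
# Hypothetical sketch of the Article 21(4) decision deadline:
# 90 calendar days from receipt of the complaint, plus at most one
# 90-day extension for highly complex disputes (180 days total).
from datetime import date, timedelta

def decision_deadline(received: date, extended: bool = False) -> date:
    """Latest date by which the certified body must make its decision available."""
    base = 90                            # standard period in calendar days
    extension = 90 if extended else 0    # discretionary extension, capped at 90
    return received + timedelta(days=base + extension)

received = date(2024, 1, 1)
print(decision_deadline(received))                 # 2024-03-31
print(decision_deadline(received, extended=True))  # 2024-06-29
```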

5.   If the out-of-court dispute settlement body decides the dispute in favour of the recipient of the service, including the individual or entity that has submitted a notice, the provider of the online platform shall bear all the fees charged by the out-of-court dispute settlement body, and shall reimburse that recipient, including the individual or entity, for any other reasonable expenses that it has paid in relation to the dispute settlement. If the out-of-court dispute settlement body decides the dispute in favour of the provider of the online platform, the recipient of the service, including the individual or entity, shall not be required to reimburse any fees or other expenses that the provider of the online platform paid or is to pay in relation to the dispute settlement, unless the out-of-court dispute settlement body finds that that recipient manifestly acted in bad faith.

The fees charged by the out-of-court dispute settlement body to the providers of online platforms for the dispute settlement shall be reasonable and shall in any event not exceed the costs incurred by the body. For recipients of the service, the dispute settlement shall be available free of charge or at a nominal fee.

Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the service, including to the individuals or entities that have submitted a notice, and to the provider of the online platform concerned, before engaging in the dispute settlement.
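The fee allocation rule of paragraph 5 can be sketched as a simple decision function. The figures, names, and simplification (treating the recipient's exposure in the bad-faith case as the body's fee) are illustrative assumptions:

```python
# Hypothetical sketch of the Article 21(5) fee allocation rule.
# Amounts and field names are illustrative assumptions.

def fee_allocation(recipient_wins: bool, recipient_bad_faith: bool,
                   body_fee: float, recipient_expenses: float) -> dict:
    """Who bears what after an out-of-court dispute settlement decision."""
    if recipient_wins:
        # The platform bears the body's fee and reimburses the recipient's
        # reasonable expenses.
        return {"platform_pays": body_fee + recipient_expenses,
                "recipient_pays": 0.0}
    # If the platform wins, the recipient reimburses nothing unless the
    # body finds it manifestly acted in bad faith.
    return {"platform_pays": body_fee,
            "recipient_pays": body_fee if recipient_bad_faith else 0.0}

print(fee_allocation(True, False, 500.0, 120.0))
print(fee_allocation(False, False, 500.0, 120.0))
```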

6.   Member States may establish out-of-court dispute settlement bodies for the purposes of paragraph 1 or support the activities of some or all out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3.

Member States shall ensure that any of their activities undertaken under the first subparagraph do not affect the ability of their Digital Services Coordinators to certify the bodies concerned in accordance with paragraph 3.

7.   A Digital Services Coordinator that has certified an out-of-court dispute settlement body shall revoke that certification if it determines, following an investigation either on its own initiative or on the basis of the information received by third parties, that the out-of-court dispute settlement body no longer meets the conditions set out in paragraph 3. Before revoking that certification, the Digital Services Coordinator shall afford that body an opportunity to react to the findings of its investigation and its intention to revoke the out-of-court dispute settlement body’s certification.

8.   Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 3, including where applicable the specifications referred to in the second subparagraph of that paragraph, as well as the out-of-court dispute settlement bodies the certification of which they have revoked. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website that is easily accessible, and keep it up to date.

9.   This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive.

Article 22

Trusted flaggers

1.   Providers of online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.

2.   The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

(a)

it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;

(b)

it is independent from any provider of online platforms;

(c)

it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

3.   Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:

(a)

the identity of the provider of hosting services,

(b)

the type of allegedly illegal content notified,

(c)

the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.

4.   Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.

5.   The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.

6.   Where a provider of online platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.

7.   The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.

8.   The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.

Article 23

Measures and protection against misuse

1.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content.

2.   Providers of online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient of the service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient of the service, the individual, the entity or the complainant.
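The first two factors above, the absolute number in point (a) and the relative proportion in point (b), are straightforwardly computable. A minimal sketch, with the caveat that any threshold for what counts as "frequent" is left to the provider's own policy under paragraph 4:

```python
# Hypothetical sketch of the first two Article 23(3) factors: the absolute
# number of manifestly unfounded notices in a given time frame (point (a))
# and their proportion of all notices submitted (point (b)).
# Record shapes are illustrative assumptions.

def misuse_indicators(notices: list) -> tuple:
    """Return (absolute count, relative proportion) of manifestly
    unfounded notices in the supplied submission history."""
    unfounded = sum(1 for n in notices if n["manifestly_unfounded"])
    proportion = unfounded / len(notices) if notices else 0.0
    return unfounded, proportion

# Example history: 8 of 10 notices in the time frame were manifestly unfounded.
history = ([{"manifestly_unfounded": True}] * 8
           + [{"manifestly_unfounded": False}] * 2)
count, share = misuse_indicators(history)
print(count, share)  # 8 0.8
```

The remaining factors, gravity (point (c)) and intention (point (d)), resist mechanical computation, which is consistent with the Article's insistence on a case-by-case, non-arbitrary assessment.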

4.   Providers of online platforms shall set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.

Article 24

Transparency reporting obligations for providers of online platforms

1.   In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a)

the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b)

the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
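The headline figure of paragraph 2 — average monthly active recipients over the past six months — can be sketched as follows, bearing in mind that the binding methodology comes only from the Article 33(3) delegated acts and that the monthly counts here are illustrative:

```python
# Hypothetical sketch of the Article 24(2) figure: the average of the six
# most recent monthly active-recipient counts in the Union.
# Counts and naming are illustrative assumptions, not the binding methodology.

def average_monthly_active_recipients(monthly_counts: list) -> float:
    """Average over the six most recent monthly active-recipient counts."""
    last_six = monthly_counts[-6:]
    return sum(last_six) / len(last_six)

# Illustrative monthly counts for the past six months.
counts = [40_000_000, 41_000_000, 43_000_000,
          44_000_000, 46_000_000, 46_000_000]
print(average_monthly_active_recipients(counts))
```

A figure like this, compared against the 45 million threshold in Article 33(1), is what triggers the notification duty in paragraph 4 below.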

3.   Providers of online platforms or of online search engines shall communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms shall ensure that the information submitted does not contain personal data.

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 44

Standards

1.   The Commission shall consult the Board, and shall support and promote the development and implementation of voluntary standards set by relevant European and international standardisation bodies, at least in respect of the following:

(a)

electronic submission of notices under Article 16;

(b)

templates, design and process standards for communicating with the recipients of the service in a user-friendly manner on restrictions resulting from terms and conditions and changes thereto;

(c)

electronic submission of notices by trusted flaggers under Article 22, including through application programming interfaces;

(d)

specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 39 and 40;

(e)

auditing of very large online platforms and of very large online search engines pursuant to Article 37;

(f)

interoperability of the advertisement repositories referred to in Article 39(2);

(g)

transmission of data between advertising intermediaries in support of transparency obligations pursuant to Article 26(1), points (b), (c) and (d);

(h)

technical measures to enable compliance with obligations relating to advertising contained in this Regulation, including the obligations regarding prominent markings for advertisements and commercial communications referred to in Article 26;

(i)

choice interfaces and presentation of information on the main parameters of different types of recommender systems, in accordance with Articles 27 and 38;

(j)

standards for targeted measures to protect minors online.

2.   The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. The relevant information regarding the update of the standards shall be publicly available and easily accessible.


whereas









keyboard_arrow_down