Digital Services Act (Regulation (EU) 2022/2065)

Article 15

Transparency reporting obligations for providers of intermediary services

1.   Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a) for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b) for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c) for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d) for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e) any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.
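Several of the figures above (points (a), (b) and (d)) are medians of handling times broken down by category. A minimal sketch of that computation in Python, using hypothetical notice records (the field names are assumptions, not anything prescribed by the Regulation):

```python
from collections import defaultdict
from statistics import median

def median_action_time_by_category(notices):
    """Group notice records by alleged illegal-content category and
    return the median time (in hours) from receipt to action.

    Each record is a dict with hypothetical keys 'category' and
    'hours_to_action'."""
    by_category = defaultdict(list)
    for n in notices:
        by_category[n["category"]].append(n["hours_to_action"])
    return {cat: median(times) for cat, times in by_category.items()}

notices = [
    {"category": "hate_speech", "hours_to_action": 4},
    {"category": "hate_speech", "hours_to_action": 10},
    {"category": "counterfeit", "hours_to_action": 48},
]
print(median_action_time_by_category(notices))
# {'hate_speech': 7.0, 'counterfeit': 48}
```

The same grouping would be repeated for each categorisation axis the report requires (issuing Member State, detection method, type of restriction, and so on).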

2.   Paragraph 1 of this Article shall not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of this Regulation.
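For reference, Recommendation 2003/361/EC defines a microenterprise as one employing fewer than 10 persons with an annual turnover or balance-sheet total not exceeding EUR 2 million, and a small enterprise as one with fewer than 50 persons and not more than EUR 10 million. A rough sketch of the paragraph 2 exemption test; this is a simplification that ignores the Recommendation's balance-sheet criterion and its rules on partner and linked enterprises:

```python
def exempt_from_article_15(staff_headcount: int, turnover_eur: float,
                           is_vlop: bool) -> bool:
    """Rough check of the Article 15(2) exemption.

    Simplified: headcount and turnover only, per the small-enterprise
    ceiling in Recommendation 2003/361/EC; a very large online
    platform (VLOP) is never exempt."""
    micro_or_small = staff_headcount < 50 and turnover_eur <= 10_000_000
    return micro_or_small and not is_vlop

print(exempt_from_article_15(12, 3_000_000, is_vlop=False))   # True
print(exempt_from_article_15(120, 3_000_000, is_vlop=False))  # False
```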

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

SECTION 2

Additional provisions applicable to providers of hosting services, including online platforms

Article 22

Trusted flaggers

1.   Providers of online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.

2.   The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

(a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;

(b) it is independent from any provider of online platforms;

(c) it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

3.   Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:

(a) the identity of the provider of hosting services,

(b) the type of allegedly illegal content notified,

(c) the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.
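The three-way categorisation required by paragraph 3 can be produced directly from a trusted flagger's notice log; a sketch with illustrative field names (none of them prescribed by the Regulation):

```python
from collections import Counter

def categorise_notices(records):
    """Count notices along the three axes Article 22(3) requires:
    hosting provider, alleged content type, and action taken.
    Record field names are illustrative assumptions."""
    return Counter((r["provider"], r["content_type"], r["action"])
                   for r in records)

records = [
    {"provider": "HostA", "content_type": "scam", "action": "removed"},
    {"provider": "HostA", "content_type": "scam", "action": "removed"},
    {"provider": "HostB", "content_type": "scam", "action": "no_action"},
]
print(categorise_notices(records))
```

Note that the published report is built from aggregate counts only; per the last sentence of paragraph 3, no personal data from the underlying notices may appear in it.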

4.   Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.

5.   The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.

6.   Where a provider of online platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.

7.   The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.

8.   The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.

Article 24

Transparency reporting obligations for providers of online platforms

1.   In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b) the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.
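The figure in paragraph 2 is a plain six-month mean of monthly active-recipient counts. Pending the Article 33(3) delegated acts, which may refine the methodology, a naive version is:

```python
def average_monthly_active_recipients(monthly_counts):
    """Average monthly active recipients over the past six months,
    per Article 24(2). Assumes one count per calendar month, ordered
    oldest to newest; how 'active recipient' is measured may be set
    by delegated acts under Article 33(3)."""
    last_six = monthly_counts[-6:]
    return sum(last_six) / len(last_six)

counts = [41_000_000, 43_000_000, 44_000_000,
          45_000_000, 46_000_000, 47_000_000]
print(average_monthly_active_recipients(counts))
```

This is the number the Digital Services Coordinator of establishment compares against the Article 33(1) threshold when deciding whether to notify the Commission under paragraph 4.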

3.   Providers of online platforms or of online search engines shall communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms shall ensure that the information submitted does not contain personal data.

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a) adapting the design, features or functioning of their services, including their online interfaces;

(b) adapting their terms and conditions and their enforcement;

(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d) testing and adapting their algorithmic systems, including their recommender systems;

(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 36

Crisis response mechanism

1.   Where a crisis occurs, the Commission, acting upon a recommendation of the Board, may adopt a decision requiring one or more providers of very large online platforms or of very large online search engines to take one or more of the following actions:

(a) assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in paragraph 2, or are likely to do so;

(b) identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

(c) report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision.

When identifying and applying measures pursuant to point (b) of this paragraph, the service provider or providers shall take due account of the gravity of the serious threat referred to in paragraph 2, of the urgency of the measures and of the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

2.   For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

3.   When taking the decision referred to in paragraph 1, the Commission shall ensure that all of the following requirements are met:

(a) the actions required by the decision are strictly necessary, justified and proportionate, having regard in particular to the gravity of the serious threat referred to in paragraph 2, the urgency of the measures and the actual or potential implications for the rights and legitimate interests of all parties concerned, including the possible failure of the measures to respect the fundamental rights enshrined in the Charter;

(b) the decision specifies a reasonable period within which specific measures referred to in paragraph 1, point (b), are to be taken, having regard, in particular, to the urgency of those measures and the time needed to prepare and implement them;

(c) the actions required by the decision are limited to a period not exceeding three months.

4.   After adopting the decision referred to in paragraph 1, the Commission shall, without undue delay, take the following steps:

(a) notify the decision to the provider or providers to which the decision is addressed;

(b) make the decision publicly available; and

(c) inform the Board of the decision, invite it to submit its views thereon, and keep it informed of any subsequent developments relating to the decision.

5.   The choice of specific measures to be taken pursuant to paragraph 1, point (b), and to paragraph 7, second subparagraph, shall remain with the provider or providers addressed by the Commission’s decision.

6.   The Commission may on its own initiative or at the request of the provider, engage in a dialogue with the provider to determine whether, in light of the provider’s specific circumstances, the intended or implemented measures referred to in paragraph 1, point (b), are effective and proportionate in achieving the objectives pursued. In particular, the Commission shall ensure that the measures taken by the service provider under paragraph 1, point (b), meet the requirements referred to in paragraph 3, points (a) and (c).

7.   The Commission shall monitor the application of the specific measures taken pursuant to the decision referred to in paragraph 1 of this Article on the basis of the reports referred to in point (c) of that paragraph and any other relevant information, including information it may request pursuant to Article 40 or 67, taking into account the evolution of the crisis. The Commission shall report regularly to the Board on that monitoring, at least on a monthly basis.

Where the Commission considers that the intended or implemented specific measures pursuant to paragraph 1, point (b), are not effective or proportionate it may, after consulting the Board, adopt a decision requiring the provider to review the identification or application of those specific measures.

8.   Where appropriate in view of the evolution of the crisis, the Commission, acting on the Board’s recommendation, may amend the decision referred to in paragraph 1 or in paragraph 7, second subparagraph, by:

(a) revoking the decision and, where appropriate, requiring the very large online platform or very large online search engine to cease to apply the measures identified and implemented pursuant to paragraph 1, point (b), or paragraph 7, second subparagraph, in particular where the grounds for such measures do not exist anymore;

(b) extending the period referred to in paragraph 3, point (c), by a period of no more than three months;

(c) taking account of experience gained in applying the measures, in particular the possible failure of the measures to respect the fundamental rights enshrined in the Charter.

9.   The requirements of paragraphs 1 to 6 shall apply to the decision and to the amendment thereof referred to in this Article.

10.   The Commission shall take utmost account of the recommendation of the Board issued pursuant to this Article.

11.   The Commission shall report to the European Parliament and to the Council on a yearly basis following the adoption of decisions in accordance with this Article, and, in any event, three months after the end of the crisis, on the application of the specific measures taken pursuant to those decisions.

Article 42

Transparency reporting obligations

1.   Providers of very large online platforms or of very large online search engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a) the human resources that the provider of very large online platforms dedicates to content moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b) the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c) the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.

3.   In addition to the information referred to in Article 24(2), the providers of very large online platforms or of very large online search engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.

4.   Providers of very large online platforms or of very large online search engines shall transmit to the Digital Services Coordinator of establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a) a report setting out the results of the risk assessment pursuant to Article 34;

(b) the specific mitigation measures put in place pursuant to Article 35(1);

(c) the audit report provided for in Article 37(4);

(d) the audit implementation report provided for in Article 37(6);

(e) where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of a very large online platform or of a very large online search engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.
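Operationally, paragraph 5 amounts to maintaining two versions of each report: a redacted public one and a complete one for the regulator, plus a record of what was removed to support the statement of reasons. A minimal sketch, with illustrative key names:

```python
def split_public_report(full_report: dict, confidential_keys: set):
    """Produce the public version of a report (confidential fields
    removed) and the list of removed keys, each of which must be
    justified in the statement of reasons sent with the complete
    report. Key names are illustrative, not prescribed."""
    public = {k: v for k, v in full_report.items()
              if k not in confidential_keys}
    removed = sorted(confidential_keys & full_report.keys())
    return public, removed

full = {"risk_assessment": "...", "security_details": "...",
        "audit_summary": "..."}
public, removed = split_public_report(full, {"security_details"})
print(removed)  # ['security_details']
```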

Article 55

Activity reports

1.   Digital Services Coordinators shall draw up annual reports on their activities under this Regulation, including the number of complaints received pursuant to Article 53 and an overview of their follow-up. The Digital Services Coordinators shall make the annual reports available to the public in a machine-readable format, subject to the applicable rules on the confidentiality of information pursuant to Article 84, and shall communicate them to the Commission and to the Board.

2.   The annual report shall also include the following information:

(a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 9 and 10 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;

(b) the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 9 and 10.

3.   Where a Member State has designated several competent authorities pursuant to Article 49, it shall ensure that the Digital Services Coordinator draws up a single report covering the activities of all competent authorities and that the Digital Services Coordinator receives all relevant information and support needed to that effect from the other competent authorities concerned.

SECTION 2

Competences, coordinated investigation and consistency mechanisms

Article 63

Tasks of the Board

1.   Where necessary to meet the objectives set out in Article 61(2), the Board shall in particular:

(a) support the coordination of joint investigations;

(b) support the competent authorities in the analysis of reports and results of audits of very large online platforms or of very large online search engines to be transmitted pursuant to this Regulation;

(c) issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation, taking into account, in particular, the freedom to provide services of the providers of intermediary services;

(d) advise the Commission on the measures referred to in Article 66 and adopt opinions concerning very large online platforms or very large online search engines in accordance with this Regulation;

(e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in cooperation with relevant stakeholders as provided for in this Regulation, including by issuing opinions or recommendations on matters related to Article 44, as well as the identification of emerging issues, with regard to matters covered by this Regulation.

2.   Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.

SECTION 4

Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

Article 91

Review

1.   By 18 February 2027, the Commission shall evaluate and report to the European Parliament, the Council and the European Economic and Social Committee on the potential effect of this Regulation on the development and economic growth of small and medium-sized enterprises.

By 17 November 2025, the Commission shall evaluate and report to the European Parliament, the Council and the European Economic and Social Committee on:

(a) the application of Article 33, including the scope of providers of intermediary services covered by the obligations set out in Section 5 of Chapter III of this Regulation;

(b) the way that this Regulation interacts with other legal acts, in particular the acts referred to in Article 2(3) and (4).

2.   By 17 November 2027, and every five years thereafter, the Commission shall evaluate this Regulation, and report to the European Parliament, the Council and the European Economic and Social Committee.

This report shall address in particular:

(a) the application of paragraph 1, second subparagraph, points (a) and (b);

(b) the contribution of this Regulation to the deepening and efficient functioning of the internal market for intermediary services, in particular as regards the cross-border provision of digital services;

(c) the application of Articles 13, 16, 20, 21, 45 and 46;

(d) the scope of the obligations on small and micro enterprises;

(e) the effectiveness of the supervision and enforcement mechanisms;

(f) the impact on the respect for the right to freedom of expression and information.

3.   Where appropriate, the report referred to in paragraphs 1 and 2 shall be accompanied by a proposal for amendment of this Regulation.

4.   The Commission shall, in the report referred to in paragraph 2 of this Article, also evaluate and report on the annual reports on their activities by the Digital Services Coordinators provided to the Commission and the Board pursuant to Article 55(1).

5.   For the purpose of paragraph 2, Member States and the Board shall send information on the request of the Commission.

6.   In carrying out the evaluations referred to in paragraph 2, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources, and shall pay specific attention to small and medium-sized enterprises and the position of new competitors.

7.   By 18 February 2027, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and of the application of Article 43, and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking utmost account of the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.

