Digital Services Act (Regulation (EU) 2022/2065)

Extract of the articles matching the search term 'consultation'.


Index

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
    Art. 35 Mitigation of risks
    Art. 39 Additional online advertising transparency
    Art. 42 Transparency reporting obligations

    SECTION 6
    Other provisions concerning due diligence obligations

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services
    Art. 62 Structure of the Board

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS



Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
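
Point (k) in effect asks for two capabilities: a prominent marking attached to generated or manipulated media wherever it is rendered, and an easy-to-use control for recipients to flag such items. The sketch below (TypeScript) shows one way a provider might model this; the Regulation prescribes no data structure, and every identifier in it (MediaItem, flagAsSynthetic and so on) is hypothetical.

    // Hypothetical model for Article 35(1)(k): generated or manipulated media
    // must carry a prominent marking, and recipients must be able to flag it.
    interface MediaItem {
      id: string;
      kind: "image" | "audio" | "video";
      // Present when the item is known or declared to be generated or
      // manipulated; drives the prominent marking on the online interface.
      syntheticLabel?: {
        markedAt: Date;
        source: "provider-detection" | "uploader-declaration" | "recipient-flag";
      };
    }

    // The easy-to-use functionality for recipients to indicate such items.
    // A real system would queue the flag for review rather than label directly.
    function flagAsSynthetic(item: MediaItem): MediaItem {
      return {
        ...item,
        syntheticLabel: item.syntheticLabel ?? {
          markedAt: new Date(),
          source: "recipient-flag",
        },
      };
    }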

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 39

Additional online advertising transparency

1.   Providers of very large online platforms or of very large online search engines that present advertisements on their online interfaces shall compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.

2.   The repository shall include at least all of the following information:

(a)

the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;

(b)

the natural or legal person on whose behalf the advertisement is presented;

(c)

the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);

(d)

the period during which the advertisement was presented;

(e)

whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;

(f)

the commercial communications published on the very large online platforms and identified pursuant to Article 26(2);

(g)

the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted.
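
Paragraph 2 reads as a field-by-field schema, and paragraph 1 adds both a retention rule (each entry stays public until one year after the advertisement was last presented) and multicriteria querying. A minimal sketch of how such a repository entry could be typed and queried follows; the Regulation mandates the information, not the format, and all identifiers here (AdRepositoryRecord, mustRemainPublished, query) are hypothetical. Note that, per paragraph 1, the repository may hold no personal data of recipients, so the sketch stores only aggregate reach figures.

    // Hypothetical schema covering the fields required by Article 39(2), points (a) to (g).
    interface AdRepositoryRecord {
      adContent: string;                  // (a) incl. product/service/brand and subject matter
      presentedOnBehalfOf: string;        // (b) natural or legal person
      paidBy?: string;                    // (c) only where different from (b)
      firstPresentedAt: Date;             // (d) period of presentation
      lastPresentedAt: Date;              // (d)
      targeting?: {                       // (e) main parameters, incl. exclusions
        includedGroups: string[];
        excludedGroups: string[];
      };
      commercialCommunications: string[]; // (f) identified pursuant to Article 26(2)
      totalRecipientsReached: number;     // (g)
      recipientsByMemberState?: Record<string, number>; // (g) where targeted
    }

    // Article 39(1): keep the entry public while the advertisement runs and
    // until one year after it was last presented.
    function mustRemainPublished(rec: AdRepositoryRecord, now: Date): boolean {
      const cutoff = new Date(rec.lastPresentedAt);
      cutoff.setFullYear(cutoff.getFullYear() + 1);
      return now <= cutoff;
    }

    // Multicriteria queries: filter on any combination of exact-match fields.
    function query(
      repo: AdRepositoryRecord[],
      criteria: Partial<Pick<AdRepositoryRecord, "presentedOnBehalfOf" | "paidBy">>
    ): AdRepositoryRecord[] {
      return repo.filter((rec) =>
        Object.entries(criteria).every(
          ([field, value]) => rec[field as keyof AdRepositoryRecord] === value
        )
      );
    }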

3.   As regards paragraph 2, points (a), (b) and (c), where a provider of a very large online platform or of a very large online search engine has removed or disabled access to a specific advertisement based on alleged illegality or incompatibility with its terms and conditions, the repository shall not include the information referred to in those points. In such case, the repository shall include, for the specific advertisement concerned, the information referred to in Article 17(3), points (a) to (e), or Article 9(2), point (a)(i), as applicable.

The Commission may, after consultation of the Board, the relevant vetted researchers referred to in Article 40 and the public, issue guidelines on the structure, organisation and functionalities of the repositories referred to in this Article.

Article 42

Transparency reporting obligations

1.   Providers of very large online platforms or of very large online search engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a)

the human resources that the provider of very large online platforms dedicates to content moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b)

the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c)

the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.
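
The per-language breakdowns in points (a) to (c), together with the per-Member-State recipient figures required by paragraph 3 below, suggest a report structure along the following lines. This is again a hedged sketch: the Regulation mandates the content, not the shape, and the names (LanguageBreakdown, VlopTransparencyReport) are hypothetical.

    // Hypothetical shape of the Article 42(2) additions to a transparency report.
    interface LanguageBreakdown {
      language: string;             // an official language of a Member State
      moderationHeadcount: number;  // (a) human resources dedicated to content moderation
      qualifications: string[];     // (b) qualifications and linguistic expertise
      trainingAndSupport: string;   // (b) training and support given to such staff
      accuracyIndicators: Record<string, number>; // (c) per Article 15(1), point (e)
    }

    interface VlopTransparencyReport {
      publishedIn: string[];        // at least one official language of the Member States
      perLanguage: LanguageBreakdown[];
      averageMonthlyRecipientsByMemberState: Record<string, number>; // paragraph 3
    }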

3.   In addition to the information referred to in Article 24(2), the providers of very large online platforms or of very large online search engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.

4.   Providers of very large online platforms or of very large online search engines shall transmit to the Digital Services Coordinator of establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a)

a report setting out the results of the risk assessment pursuant to Article 34;

(b)

the specific mitigation measures put in place pursuant to Article 35(1);

(c)

the audit report provided for in Article 37(4);

(d)

the audit implementation report provided for in Article 37(6);

(e)

where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of a very large online platform or of a very large online search engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.

Article 62

Structure of the Board

1.   The Board shall be composed of Digital Services Coordinators who shall be represented by high-level officials. The failure by one or more Member States to designate a Digital Services Coordinator shall not prevent the Board from performing its tasks under this Regulation. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them.

2.   The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and in line with its rules of procedure. When the Board is requested to adopt a recommendation pursuant to this Regulation, it shall immediately make the request available to other Digital Services Coordinators through the information sharing system set out in Article 85.

3.   Each Member State shall have one vote. The Commission shall not have voting rights.

The Board shall adopt its acts by simple majority. When adopting a recommendation to the Commission referred to in Article 36(1), first subparagraph, the Board shall vote within 48 hours after the request of the Chair of the Board.

4.   The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation.

5.   The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available.

6.   The Board may consult interested parties, and shall make the results of such consultation publicly available.

7.   The Board shall adopt its rules of procedure, following the consent of the Commission.

