Digital Services Act (Regulation (EU) 2022/2065)

Provisions containing the term 'algorithmic': Articles 14, 34, 35 and 40.



Article 14

Terms and conditions

1.   Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.
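Paragraph 1 requires the information to be publicly available in a machine-readable format, but prescribes no schema. As an illustration only, a provider might publish a JSON document along these lines; every field name below is hypothetical, not mandated by the Regulation:

```python
import json

# Hypothetical machine-readable summary of the Article 14(1) information.
# Field names and structure are illustrative only; the DSA does not
# prescribe any particular schema or vocabulary.
terms_summary = {
    "service": "ExampleHost",
    "content_moderation": {
        "tools": ["algorithmic decision-making", "human review"],
        "policies_url": "https://example.com/policies",
    },
    "complaint_handling": {
        "internal_system": True,
        "rules_of_procedure_url": "https://example.com/complaints",
    },
}

document = json.dumps(terms_summary, indent=2)
print(document)
```

Publishing such a document at a well-known URL would satisfy "machine-readable" in the ordinary technical sense, though the exact format remains for the provider (or future guidance) to choose.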

2.   Providers of intermediary services shall inform the recipients of the service of any significant change to the terms and conditions.

3.   Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider of that intermediary service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online platforms and of very large online search engines shall provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online platforms and very large online search engines within the meaning of Article 33 shall publish their terms and conditions in the official languages of all the Member States in which they offer their services.

Article 34

Risk assessment

1.   Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.

They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

(a) the dissemination of illegal content through their services;

(b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;

(c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

(d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
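Paragraph 1 requires assessments "proportionate to the systemic risks, taking into consideration their severity and probability". A minimal sketch of how a provider might operationalise that as a prioritisation step; the four categories come from points (a) to (d) above, while the numeric scores are invented for illustration and carry no legal meaning:

```python
# Illustrative severity-x-probability ranking of the four Article 34(1)
# systemic risk categories. Scores (1-5) are hypothetical examples only.
risks = [
    {"risk": "dissemination of illegal content", "severity": 4, "probability": 3},
    {"risk": "negative effects on fundamental rights", "severity": 5, "probability": 2},
    {"risk": "negative effects on civic discourse and elections", "severity": 3, "probability": 3},
    {"risk": "gender-based violence, public health and minors", "severity": 5, "probability": 1},
]

# Higher product = higher assessment priority.
ranked = sorted(risks, key=lambda r: r["severity"] * r["probability"], reverse=True)
for r in ranked:
    print(r["risk"], "->", r["severity"] * r["probability"])
```

In practice the scoring would rest on evidence gathered under paragraph 2 (recommender-system design, moderation systems, terms enforcement, advertising systems and data practices), not on fixed numbers.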

2.   When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:

(a) the design of their recommender systems and any other relevant algorithmic system;

(b) their content moderation systems;

(c) the applicable terms and conditions and their enforcement;

(d) systems for selecting and presenting advertisements;

(e) data related practices of the provider.

The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.

The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

3.   Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of establishment.

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a) adapting the design, features or functioning of their services, including their online interfaces;

(b) adapting their terms and conditions and their enforcement;

(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d) testing and adapting their algorithmic systems, including their recommender systems;

(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
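Point (k) combines two obligations: prominent markings on generated or manipulated media, and a user-facing way to flag such content. A minimal sketch of the first obligation, assuming a hypothetical media-item record; the `generated_or_manipulated` field and marking text are invented for illustration:

```python
# Illustrative sketch of the Article 35(1)(k) "prominent marking" measure:
# attach a visible label to generated/manipulated media before it is
# presented on the online interface. All field names are hypothetical.
def present(item: dict) -> dict:
    out = dict(item)  # do not mutate the stored record
    if item.get("generated_or_manipulated"):
        out["marking"] = "AI-generated or manipulated content"
    return out

video = {"id": "v1", "generated_or_manipulated": True}
print(present(video)["marking"])
```

The second obligation (a flag recipients can raise) would be a separate reporting pathway feeding back into the `generated_or_manipulated` determination.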

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 40

Data access and scrutiny

1.   Providers of very large online platforms or of very large online search engines shall provide the Digital Services Coordinator of establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation.

2.   Digital Services Coordinators and the Commission shall use the data accessed pursuant to paragraph 1 only for the purpose of monitoring and assessing compliance with this Regulation and shall take due account of the rights and interests of the providers of very large online platforms or of very large online search engines and the recipients of the service concerned, including the protection of personal data, the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

3.   For the purposes of paragraph 1, providers of very large online platforms or of very large online search engines shall, at the request of either the Digital Services Coordinator of establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems.

4.   Upon a reasoned request from the Digital Services Coordinator of establishment, providers of very large online platforms or of very large online search engines shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 8 of this Article, for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35.

5.   Within 15 days following receipt of a request as referred to in paragraph 4, providers of very large online platforms or of very large online search engines may request the Digital Services Coordinator of establishment to amend the request, where they consider that they are unable to give access to the data requested for one of the following two reasons:

(a) they do not have access to the data;

(b) giving access to the data will lead to significant vulnerabilities in the security of their service or the protection of confidential information, in particular trade secrets.

6.   Requests for amendment pursuant to paragraph 5 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request.

The Digital Services Coordinator of establishment shall decide on the request for amendment within 15 days and communicate to the provider of the very large online platform or of the very large online search engine its decision and, where relevant, the amended request and the new period to comply with the request.

7.   Providers of very large online platforms or of very large online search engines shall facilitate and provide access to data pursuant to paragraphs 1 and 4 through appropriate interfaces specified in the request, including online databases or application programming interfaces.
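Paragraph 7 names "online databases or application programming interfaces" but defines no concrete API. A minimal sketch of what an authenticated request to such an interface might look like, using only the Python standard library; the endpoint path, dataset name and token scheme are all invented for illustration:

```python
from urllib.request import Request

# Hypothetical client-side request to a provider's data-access API of the
# kind Article 40(7) contemplates. Nothing here is specified by the DSA:
# the URL layout, dataset identifier and bearer-token scheme are examples.
def build_access_request(base_url: str, dataset: str, token: str) -> Request:
    url = f"{base_url}/v1/datasets/{dataset}"
    return Request(url, headers={"Authorization": f"Bearer {token}"})

req = build_access_request(
    "https://api.example-vlop.eu",   # hypothetical provider endpoint
    "systemic-risk-metrics",         # hypothetical dataset name
    "VETTED-TOKEN",                  # credential issued after vetting
)
print(req.full_url)
```

The delegated acts foreseen in paragraph 13 are where the actual technical conditions for such interfaces would be laid down.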

8.   Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of very large online platform or of very large online search engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:

(a) they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;

(b) they are independent from commercial interests;

(c) their application discloses the funding of the research;

(d) they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;

(e) their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in paragraph 4;

(f) the planned research activities will be carried out for the purposes laid down in paragraph 4;

(g) they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.

Upon receipt of the application pursuant to this paragraph, the Digital Services Coordinator of establishment shall inform the Commission and the Board.

9.   Researchers may also submit their application to the Digital Services Coordinator of the Member State of the research organisation to which they are affiliated. Upon receipt of the application pursuant to this paragraph the Digital Services Coordinator shall conduct an initial assessment as to whether the respective researchers meet all of the conditions set out in paragraph 8. The respective Digital Services Coordinator shall subsequently send the application, together with the supporting documents submitted by the respective researchers and the initial assessment, to the Digital Services Coordinator of establishment. The Digital Services Coordinator of establishment shall take a decision whether to award a researcher the status of ‘vetted researcher’ without undue delay.

While taking due account of the initial assessment provided, the final decision to award a researcher the status of ‘vetted researcher’ lies within the competence of the Digital Services Coordinator of establishment, pursuant to paragraph 8.

10.   The Digital Services Coordinator that awarded the status of vetted researcher and issued the reasoned request for data access to the providers of very large online platforms or of very large online search engines in favour of a vetted researcher shall issue a decision terminating the access if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, that the vetted researcher no longer meets the conditions set out in paragraph 8, and shall inform the provider of the very large online platform or of the very large online search engine concerned of the decision. Before terminating the access, the Digital Services Coordinator shall allow the vetted researcher to react to the findings of its investigation and to its intention to terminate the access.

11.   Digital Services Coordinators of establishment shall communicate to the Board the names and contact information of the natural persons or entities to which they have awarded the status of ‘vetted researcher’ in accordance with paragraph 8, as well as the purpose of the research in respect of which the application was made or, where they have terminated the access to the data in accordance with paragraph 10, communicate that information to the Board.

12.   Providers of very large online platforms or of very large online search engines shall give access without undue delay to data, including, where technically possible, to real-time data, provided that the data is publicly accessible in their online interface by researchers, including those affiliated to not for profit bodies, organisations and associations, who comply with the conditions set out in paragraph 8, points (b), (c), (d) and (e), and who use the data solely for performing research that contributes to the detection, identification and understanding of systemic risks in the Union pursuant to Article 34(1).

13.   The Commission shall, after consulting the Board, adopt delegated acts supplementing this Regulation by laying down the technical conditions under which providers of very large online platforms or of very large online search engines are to share data pursuant to paragraphs 1 and 4 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with researchers can take place in compliance with Regulation (EU) 2016/679, as well as relevant objective indicators, procedures and, where necessary, independent advisory mechanisms in support of sharing of data, taking into account the rights and interests of the providers of very large online platforms or of very large online search engines and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.

