Digital Services Act (EU) 2022/2065
Article 35
Mitigation of risks
1. Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:
(a) adapting the design, features or functioning of their services, including their online interfaces;
(b) adapting their terms and conditions and their enforcement;
(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
(d) testing and adapting their algorithmic systems, including their recommender systems;
(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;
(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;
(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;
(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;
(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:
(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;
(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.
Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.
3. The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.