Digital Services Act (Regulation (EU) 2022/2065)
Article 35
Mitigation of risks
1. Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:
(a) adapting the design, features or functioning of their services, including their online interfaces;
(b) adapting their terms and conditions and their enforcement;
(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
(d) testing and adapting their algorithmic systems, including their recommender systems;
(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;
(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;
(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;
(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;
(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
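Point (k) effectively asks for two pieces of product machinery: a prominent marking on media known to be generated or manipulated, and an easy-to-use control letting recipients flag such content themselves. The following Python sketch is purely illustrative; the Regulation prescribes no implementation, and the names `MediaItem`, `render_with_marking` and `flag_as_synthetic` are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class MediaItem:
    media_id: str
    url: str
    # How this flag is populated (detection pipeline, uploader
    # declaration, etc.) is left open by the Regulation.
    known_synthetic: bool = False
    user_flagged_synthetic: bool = False

def render_with_marking(item: MediaItem) -> dict:
    """Attach a prominent marking to synthetic media before display (point (k))."""
    payload = {"media_id": item.media_id, "url": item.url}
    if item.known_synthetic or item.user_flagged_synthetic:
        payload["marking"] = "Generated or manipulated content"
        payload["marking_prominent"] = True  # the UI must render this visibly
    return payload

def flag_as_synthetic(item: MediaItem) -> None:
    """Easy-to-use functionality for a recipient to indicate such information."""
    item.user_flagged_synthetic = True
```

Whether the marking is a visual overlay, a metadata field consumed by the client, or both is a design choice the provider must justify under the "reasonable, proportionate and effective" standard of paragraph 1.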
2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:
(a) identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;
(b) best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.
Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.
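As a rough illustration of the breakdown this subparagraph requires, the Python sketch below aggregates hypothetical risk reports per Member State and for the Union as a whole. The data shape, `(member_state, risk_category)` pairs, is an assumption made for this example; the Regulation specifies no schema for the Board's reports.

```python
from collections import Counter
from typing import Iterable

def aggregate_risk_reports(
    reports: Iterable[tuple[str, str]],  # (member_state, risk_category) pairs
) -> dict:
    """Break systemic risks down by Member State and Union-wide."""
    by_member_state: dict[str, Counter] = {}
    union_wide: Counter = Counter()
    for member_state, risk in reports:
        by_member_state.setdefault(member_state, Counter())[risk] += 1
        union_wide[risk] += 1
    return {"by_member_state": by_member_state, "union_wide": union_wide}

# Example: three provider reports feeding an annual overview.
summary = aggregate_risk_reports([
    ("DE", "disinformation"),
    ("FR", "disinformation"),
    ("DE", "illegal hate speech"),
])
```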
3. The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, enshrined in the Charter, of all parties involved. When preparing those guidelines, the Commission shall organise public consultations.