Digital Services Act — Regulation (EU) 2022/2065 (EN)

Index of provisions cited:

    CHAPTER I
    GENERAL PROVISIONS

    CHAPTER II
    LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES

    CHAPTER III
    DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

    SECTION 1
    Provisions applicable to all providers of intermediary services

    SECTION 2
    Additional provisions applicable to providers of hosting services, including online platforms

    SECTION 3
    Additional provisions applicable to providers of online platforms
  • Art. 22 Trusted flaggers
  • Art. 25 Online interface design and organisation
  • Art. 28 Online protection of minors

    SECTION 4
    Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

    SECTION 5
    Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
  • Art. 35 Mitigation of risks
  • Art. 39 Additional online advertising transparency

    SECTION 6
    Other provisions concerning due diligence obligations

    CHAPTER IV
    IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

    SECTION 1
    Competent authorities and national Digital Services Coordinators

    SECTION 2
    Competences, coordinated investigation and consistency mechanisms

    SECTION 3
    European Board for Digital Services
  • Art. 61 European Board for Digital Services

    SECTION 4
    Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines
  • Art. 63 Tasks of the Board

    SECTION 5
    Common provisions on enforcement

    SECTION 6
    Delegated and implementing acts

    CHAPTER V
    FINAL PROVISIONS


Article 22

Trusted flaggers

1.   Providers of online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.

2.   The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

(a)

it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;

(b)

it is independent from any provider of online platforms;

(c)

it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

3.   Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:

(a)

the identity of the provider of hosting services,

(b)

the type of allegedly illegal content notified,

(c)

the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.
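The aggregation that paragraph 3 requires can be sketched as follows. This is an illustrative schema only: the class, field and function names are assumptions, and the Regulation mandates the report's content, not any particular format or code. Note that only counts are published, never personal data.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Notice:
    """One notice submitted under Article 16 (illustrative fields only)."""
    hosting_provider: str  # Art. 22(3)(a): identity of the hosting-service provider
    content_type: str      # Art. 22(3)(b): type of allegedly illegal content
    action_taken: str      # Art. 22(3)(c): action taken by the provider

def annual_report(notices: list[Notice]) -> dict[str, Counter]:
    """Aggregate notice counts along the three axes required by Art. 22(3).

    Only aggregate numbers appear in the output; per Art. 22(3), the
    published report must not contain personal data.
    """
    return {
        "by_provider": Counter(n.hosting_provider for n in notices),
        "by_content_type": Counter(n.content_type for n in notices),
        "by_action": Counter(n.action_taken for n in notices),
    }
```

A trusted flagger would run such an aggregation over the notices it submitted during the relevant period and publish the resulting counts alongside the procedural explanation required by the third subparagraph.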

4.   Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.

5.   The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.

6.   Where a provider of online platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.

7.   The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.

8.   The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.

Article 25

Online interface design and organisation

1.   Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.

2.   The prohibition in paragraph 1 shall not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679.

3.   The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably:

(a)

giving more prominence to certain choices when asking the recipient of the service for a decision;

(b)

repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience;

(c)

making the procedure for terminating a service more difficult than subscribing to it.

Article 28

Online protection of minors

1.   Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service.

2.   Providers of online platforms shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.

3.   Compliance with the obligations set out in this Article shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor.

4.   The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of paragraph 1.
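The interplay of paragraphs 2 and 3 can be sketched as a simple gate. The names below are hypothetical, not a mandated API; the point of the sketch is that the prohibition bites only when minority is known with reasonable certainty, and that, per paragraph 3, an unknown status does not oblige the provider to collect further data to find out.

```python
from enum import Enum, auto

class MinorStatus(Enum):
    """Provider's knowledge about a recipient of the service (illustrative)."""
    KNOWN_MINOR = auto()  # aware with reasonable certainty the recipient is a minor
    KNOWN_ADULT = auto()
    UNKNOWN = auto()      # Art. 28(3): no duty to process extra personal data to resolve this

def profiling_ads_permitted(status: MinorStatus) -> bool:
    """Art. 28(2) gate (hypothetical helper): profiling-based advertising
    (profiling as defined in GDPR Art. 4(4)) is barred only when the
    recipient is known with reasonable certainty to be a minor.
    """
    return status is not MinorStatus.KNOWN_MINOR
```

The tri-state status captures the balance the Article strikes: a provider need not build age-assessment profiling just to apply the ban, but once it knows, with reasonable certainty, that a recipient is a minor, profiled advertising must stop.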

SECTION 4

Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders

Article 35

Mitigation of risks

1.   Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online interfaces;

(b)

adapting their terms and conditions and their enforcement;

(c)

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;

(d)

testing and adapting their algorithmic systems, including their recommender systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful, is distinguishable through prominent markings when presented on their online interfaces and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online platforms and of very large online search engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online platforms and of very large online search engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 39

Additional online advertising transparency

1.   Providers of very large online platforms or of very large online search engines that present advertisements on their online interfaces shall compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.

2.   The repository shall include at least all of the following information:

(a)

the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;

(b)

the natural or legal person on whose behalf the advertisement is presented;

(c)

the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);

(d)

the period during which the advertisement was presented;

(e)

whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;

(f)

the commercial communications published on the very large online platforms and identified pursuant to Article 26(2);

(g)

the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted.

3.   As regards paragraph 2, points (a), (b) and (c), where a provider of a very large online platform or of a very large online search engine has removed or disabled access to a specific advertisement based on alleged illegality or incompatibility with its terms and conditions, the repository shall not include the information referred to in those points. In such case, the repository shall include, for the specific advertisement concerned, the information referred to in Article 17(3), points (a) to (e), or Article 9(2), point (a)(i), as applicable.

The Commission may, after consultation of the Board, the relevant vetted researchers referred to in Article 40 and the public, issue guidelines on the structure, organisation and functionalities of the repositories referred to in this Article.
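The repository obligation can be illustrated with a minimal record schema and an exact-match query over it. All names here are assumptions: Article 39 prescribes the information to be held and a "searchable and reliable tool that allows multicriteria queries", not any data model, and a real implementation would also expose the repository through APIs and exclude recipients' personal data.

```python
from dataclasses import dataclass, field

@dataclass
class AdRecord:
    """One entry in an Art. 39(2) ad repository (illustrative schema;
    must never hold personal data of the recipients, per Art. 39(1))."""
    content: str                          # (a) ad content, product/service/brand, subject matter
    presented_on_behalf_of: str           # (b) natural or legal person behind the ad
    paid_by: str                          # (c) payer, if different from (b)
    period: tuple[str, str]               # (d) first and last ISO dates the ad was presented
    targeting_parameters: list[str]       # (e) main targeting/exclusion parameters, if targeted
    commercial_communications: list[str]  # (f) communications identified under Art. 26(2)
    total_reach: int                      # (g) total number of recipients reached
    reach_by_member_state: dict[str, int] = field(default_factory=dict)  # (g) breakdown

def query(repo: list[AdRecord], **criteria) -> list[AdRecord]:
    """Multicriteria lookup sketch: exact-match filtering on any schema field."""
    return [r for r in repo
            if all(getattr(r, k) == v for k, v in criteria.items())]
```

For example, `query(repo, paid_by="AgencyX")` returns all records whose payer matches, and criteria can be combined (`paid_by=..., presented_on_behalf_of=...`) to approximate the multicriteria queries the Article requires.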

Article 61

European Board for Digital Services

1.   An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established.

2.   The Board shall advise the Digital Services Coordinators and the Commission in accordance with this Regulation to achieve the following objectives:

(a)

contributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation;

(b)

coordinating and contributing to guidelines and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation;

(c)

assisting the Digital Services Coordinators and the Commission in the supervision of very large online platforms.

Article 63

Tasks of the Board

1.   Where necessary to meet the objectives set out in Article 61(2), the Board shall in particular:

(a)

support the coordination of joint investigations;

(b)

support the competent authorities in the analysis of reports and results of audits of very large online platforms or of very large online search engines to be transmitted pursuant to this Regulation;

(c)

issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation, taking into account, in particular, the freedom to provide services of the providers of intermediary services;

(d)

advise the Commission on the measures referred to in Article 66 and adopt opinions concerning very large online platforms or very large online search engines in accordance with this Regulation;

(e)

support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in cooperation with relevant stakeholders as provided for in this Regulation, including by issuing opinions or recommendations on matters related to Article 44, as well as the identification of emerging issues, with regard to matters covered by this Regulation.

2.   Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.

SECTION 4

Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines

