Regulation (EU) 2022/2065 (Digital Services Act): selected provisions, English text

Article 4

Mere_conduit

1.   Where an information_society_service is provided that consists of the transmission in a communication network of information provided by a recipient_of_the_service, or the provision of access to a communication network, the service provider shall not be liable for the information transmitted or accessed, on condition that the provider:

(a)

does not initiate the transmission;

(b)

does not select the receiver of the transmission; and

(c)

does not select or modify the information contained in the transmission.

2.   The acts of transmission and of provision of access referred to in paragraph 1 shall include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission.

3.   This Article shall not affect the possibility for a judicial or administrative authority, in accordance with a Member State’s legal system, to require the service provider to terminate or prevent an infringement.

Article 9

Orders to act against illegal_content

1.   Upon the receipt of an order to act against one or more specific items of illegal_content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union law or national law in compliance with Union law, providers of intermediary_services shall inform the authority issuing the order, or any other authority specified in the order, of any effect given to the order without undue delay, specifying if and when effect was given to the order.

2.   Member States shall ensure that when an order referred to in paragraph 1 is transmitted to the provider, it meets at least the following conditions:

(a)

that order contains the following elements:

(i)

a reference to the legal basis under Union or national law for the order;

(ii)

a statement of reasons explaining why the information is illegal_content, by reference to one or more specific provisions of Union law or national law in compliance with Union law;

(iii)

information identifying the issuing authority;

(iv)

clear information enabling the provider of intermediary_services to identify and locate the illegal_content concerned, such as one or more exact URL and, where necessary, additional information;

(v)

information about redress mechanisms available to the provider of intermediary_services and to the recipient_of_the_service who provided the content;

(vi)

where applicable, information about which authority is to receive the information about the effect given to the orders;

(b)

the territorial scope of that order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, is limited to what is strictly necessary to achieve its objective;

(c)

that order is transmitted in one of the languages declared by the provider of intermediary_services pursuant to Article 11(3) or in another official language of the Member States, agreed between the authority issuing the order and that provider, and is sent to the electronic point of contact designated by that provider, in accordance with Article 11; where the order is not drafted in the language declared by the provider of intermediary_services or in another bilaterally agreed language, the order may be transmitted in the language of the authority issuing the order, provided that it is accompanied by a translation into such declared or bilaterally agreed language of at least the elements set out in points (a) and (b) of this paragraph.

3.   The authority issuing the order or, where applicable, the authority specified therein, shall transmit it, along with any information received from the provider of intermediary_services concerning the effect given to that order, to the Digital Services Coordinator from the Member State of the issuing authority.

4.   After receiving the order from the judicial or administrative authority, the Digital Services Coordinator of the Member State concerned shall, without undue delay, transmit a copy of the order referred to in paragraph 1 of this Article to all other Digital Services Coordinators through the system established in accordance with Article 85.

5.   At the latest when effect is given to the order or, where applicable, at the time provided by the issuing authority in its order, providers of intermediary_services shall inform the recipient_of_the_service concerned of the order received and the effect given to it. Such information provided to the recipient_of_the_service shall include a statement of reasons, the possibilities for redress that exist, and a description of the territorial scope of the order, in accordance with paragraph 2.

6.   The conditions and requirements laid down in this Article shall be without prejudice to national civil and criminal procedural law.
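
The elements required by paragraph 2, point (a), are enumerable, so a provider can check an incoming order for completeness mechanically before giving effect to it. The Python sketch below is purely illustrative: the Article prescribes which elements an order must contain, not any data format, and every field name is an assumption of this note.

from dataclasses import dataclass

@dataclass
class RemovalOrder:
    legal_basis: str               # point (a)(i): Union or national law relied on
    statement_of_reasons: str      # point (a)(ii): why the information is illegal content
    issuing_authority: str         # point (a)(iii)
    content_locations: list        # point (a)(iv): e.g. one or more exact URLs
    redress_information: str       # point (a)(v)
    reporting_authority: str = ""  # point (a)(vi): only "where applicable", so optional here

def missing_elements(order):
    # Points (i) to (v) are always required; point (vi) is conditional and
    # therefore not checked.
    required = {
        "(i) legal basis": order.legal_basis,
        "(ii) statement of reasons": order.statement_of_reasons,
        "(iii) issuing authority": order.issuing_authority,
        "(iv) content locations": order.content_locations,
        "(v) redress information": order.redress_information,
    }
    return [name for name, value in required.items() if not value]

order = RemovalOrder("Art. X of national law", "", "Authority Y",
                     ["https://example.org/item/1"], "appeal before court Z")
print(missing_elements(order))  # ['(ii) statement of reasons']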

Article 10

Orders to provide information

1.   Upon receipt of an order to provide specific information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union law or national law in compliance with Union law, providers of intermediary_services shall, without undue delay, inform the authority issuing the order, or any other authority specified in the order, of its receipt and of the effect given to the order, specifying if and when effect was given to the order.

2.   Member States shall ensure that when an order referred to in paragraph 1 is transmitted to the provider, it meets at least the following conditions:

(a)

that order contains the following elements:

(i)

a reference to the legal basis under Union or national law for the order;

(ii)

information identifying the issuing authority;

(iii)

clear information enabling the provider of intermediary_services to identify the specific recipient or recipients on whom information is sought, such as one or more account names or unique identifiers;

(iv)

a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary_services with applicable Union law or national law in compliance with Union law, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;

(v)

information about redress mechanisms available to the provider and to the recipients of the service concerned;

(vi)

where applicable, information about which authority is to receive the information about the effect given to the orders;

(b)

that order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control;

(c)

that order is transmitted in one of the languages declared by the provider of intermediary_services pursuant to Article 11(3) or in another official language of the Member States, agreed between the authority issuing the order and the provider, and is sent to the electronic point of contact designated by that provider, in accordance with Article 11; where the order is not drafted in the language declared by the provider of intermediary_services or in another bilaterally agreed language, the order may be transmitted in the language of the authority issuing the order, provided that it is accompanied by a translation into such declared or bilaterally agreed language of at least the elements set out in points (a) and (b) of this paragraph.

3.   The authority issuing the order or, where applicable, the authority specified therein, shall transmit it, along with any information received from the provider of intermediary_services concerning the effect given to that order, to the Digital Services Coordinator from the Member State of the issuing authority.

4.   After receiving the order from the judicial or administrative authority, the Digital Services Coordinator of the Member State concerned shall, without undue delay, transmit a copy of the order referred to in paragraph 1 of this Article to all Digital Services Coordinators through the system established in accordance with Article 85.

5.   At the latest when effect is given to the order, or, where applicable, at the time provided by the issuing authority in its order, providers of intermediary_services shall inform the recipient_of_the_service concerned of the order received and the effect given to it. Such information provided to the recipient_of_the_service shall include a statement of reasons and the possibilities for redress that exist, in accordance with paragraph 2.

6.   The conditions and requirements laid down in this Article shall be without prejudice to national civil and criminal procedural law.

CHAPTER III

DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT

SECTION 1

Provisions applicable to all providers of intermediary_services

Article 11

Points of contact for Member States’ authorities, the Commission and the Board

1.   Providers of intermediary_services shall designate a single point of contact to enable them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 61 for the application of this Regulation.

2.   Providers of intermediary_services shall make public the information necessary to easily identify and communicate with their single points of contact. That information shall be easily accessible, and shall be kept up to date.

3.   Providers of intermediary_services shall specify in the information referred to in paragraph 2 the official language or languages of the Member States which, in addition to a language broadly understood by the largest possible number of Union citizens, can be used to communicate with their points of contact, and which shall include at least one of the official languages of the Member State in which the provider of intermediary_services has its main establishment or where its legal representative resides or is established.
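
Paragraphs 2 and 3 amount to a very small public disclosure: a reachable electronic contact point plus the languages in which it can be addressed. A hypothetical machine-readable rendering follows; nothing in the Article requires JSON or these particular keys.

import json

point_of_contact = {
    # Assumed example values throughout; Article 11 fixes the content of the
    # disclosure, not its format.
    "electronic_contact": "dsa-authorities@provider.example",
    # Paragraph 3: at least one official language of the Member State of main
    # establishment (Ireland is assumed here), plus a language broadly
    # understood in the Union.
    "declared_languages": ["en", "ga"],
}
print(json.dumps(point_of_contact, indent=2))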

Article 14

Terms and conditions

1.   Providers of intermediary_services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms_and_conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content_moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. It shall be set out in clear, plain, intelligible, user-friendly and unambiguous language, and shall be publicly available in an easily accessible and machine-readable format.

2.   Providers of intermediary_services shall inform the recipients of the service of any significant change to the terms_and_conditions.

3.   Where an intermediary_service is primarily directed at minors or is predominantly used by them, the provider of that intermediary_service shall explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand.

4.   Providers of intermediary_services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

5.   Providers of very large online_platforms and of very large online_search_engines shall provide recipients of services with a concise, easily-accessible and machine-readable summary of the terms_and_conditions, including the available remedies and redress mechanisms, in clear and unambiguous language.

6.   Very large online_platforms and very large online_search_engines within the meaning of Article 33 shall publish their terms_and_conditions in the official languages of all the Member States in which they offer their services.

Article 15

Transparency reporting obligations for providers of intermediary_services

1.   Providers of intermediary_services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content_moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a)

for providers of intermediary_services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal_content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b)

for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal_content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms_and_conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c)

for providers of intermediary_services, meaningful and comprehensible information about the content_moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content_moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal_content or violation of the terms_and_conditions of the service provider, by the detection method and by the type of restriction applied;

(d)

for providers of intermediary_services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms_and_conditions and additionally, for providers of online_platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e)

any use made of automated means for the purpose of content_moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.

2.   Paragraph 1 of this Article shall not apply to providers of intermediary_services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online_platforms within the meaning of Article 33 of this Regulation.

3.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article, including harmonised reporting periods. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
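
Points (a), (b) and (d) of paragraph 1 all ask for the median, not the mean, handling time. A minimal sketch of that computation, assuming per-item durations have already been collected; the median's robustness to a few very slow cases is presumably why the Regulation prefers it.

from datetime import timedelta
from statistics import median

def median_handling_time(durations):
    # Median of per-item handling durations, e.g. notice receipt to action.
    return timedelta(seconds=median(d.total_seconds() for d in durations))

# Three notices actioned in 2h, 5h and 90h: the median (5h) is not distorted
# by the slow outlier the way the mean (about 32h) would be.
times = [timedelta(hours=2), timedelta(hours=5), timedelta(hours=90)]
print(median_handling_time(times))  # 5:00:00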

SECTION 2

Additional provisions applicable to providers of hosting services, including online_platforms

Article 16

Notice and action mechanisms

1.   Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal_content. Those mechanisms shall be easy to access and user-friendly, and shall allow for the submission of notices exclusively by electronic means.

2.   The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services shall take the necessary measures to enable and to facilitate the submission of notices containing all of the following elements:

(a)

a sufficiently substantiated explanation of the reasons why the individual or entity alleges the information in question to be illegal_content;

(b)

a clear indication of the exact electronic location of that information, such as the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal_content adapted to the type of content and to the specific type of hosting service;

(c)

the name and email address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU;

(d)

a statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.

3.   Notices referred to in this Article shall be considered to give rise to actual knowledge or awareness for the purposes of Article 6 in respect of the specific item of information concerned where they allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.

4.   Where the notice contains the electronic contact information of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity.

5.   The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision.

6.   Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 5.
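
The notice elements of paragraph 2 map naturally onto a submission payload that an intake endpoint can validate, including the point (c) exception for the offences referred to in Directive 2011/93/EU. The sketch below is hypothetical: the Article fixes the elements, not any wire format, and the field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Notice:
    explanation: str           # point (a): why the content is considered illegal
    locations: list            # point (b): exact URL(s) or equivalent location
    notifier_name: str         # point (c)
    notifier_email: str        # point (c)
    bona_fide_statement: bool  # point (d)

def is_complete(notice, child_abuse_offence=False):
    # Point (c) identification may be omitted only for the offences referred
    # to in Articles 3 to 7 of Directive 2011/93/EU.
    base = bool(notice.explanation and notice.locations and notice.bona_fide_statement)
    if child_abuse_offence:
        return base
    return base and bool(notice.notifier_name and notice.notifier_email)

notice = Notice("counterfeit listing of brand B", ["https://host.example/item/42"],
                "Jane Doe", "jane@example.org", True)
print(is_complete(notice))  # True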

Article 22

Trusted flaggers

1.   Providers of online_platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay.

2.   The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

(a)

it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal_content;

(b)

it is independent from any provider of online_platforms;

(c)

it carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

3.   Trusted flaggers shall publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report shall list at least the number of notices categorised by:

(a)

the identity of the provider of hosting services,

(b)

the type of allegedly illegal_content notified,

(c)

the action taken by the provider.

Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence.

Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.

4.   Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and email addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2 or whose trusted flagger status they have suspended in accordance with paragraph 6 or revoked in accordance with paragraph 7.

5.   The Commission shall publish the information referred to in paragraph 4 in a publicly available database, in an easily accessible and machine-readable format, and shall keep the database up to date.

6.   Where a provider of online_platforms has information indicating that a trusted flagger has submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 16, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 20(4), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. Upon receiving the information from the provider of online_platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. That investigation shall be carried out without undue delay.

7.   The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by a provider of online_platforms pursuant to paragraph 6, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger.

8.   The Commission, after consulting the Board, shall, where necessary, issue guidelines to assist providers of online_platforms and Digital Services Coordinators in the application of paragraphs 2, 6 and 7.
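
Paragraph 1 requires 'technical and organisational measures' so that trusted-flagger notices are given priority. One hypothetical technical reading is a two-level priority queue at the notice intake stage; the Article mandates the outcome (priority, processing without undue delay), not this or any particular mechanism.

import heapq
import itertools

TRUSTED, DEFAULT = 0, 1
_counter = itertools.count()   # tie-breaker: FIFO within each priority level
_queue = []

def enqueue(notice_id, from_trusted_flagger):
    level = TRUSTED if from_trusted_flagger else DEFAULT
    heapq.heappush(_queue, (level, next(_counter), notice_id))

def next_notice():
    # Trusted-flagger notices always come out first.
    return heapq.heappop(_queue)[2]

enqueue("n-1001", from_trusted_flagger=False)
enqueue("n-1002", from_trusted_flagger=True)
print(next_notice())  # n-1002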

Article 23

Measures and protection against misuse

1.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal_content.

2.   Providers of online_platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 16 and 20, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded.

3.   When deciding on suspension, providers of online_platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether the recipient_of_the_service, the individual, the entity or the complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of online_platforms. Those circumstances shall include at least the following:

(a)

the absolute numbers of items of manifestly illegal_content or manifestly unfounded notices or complaints, submitted within a given time frame;

(b)

the relative proportion thereof in relation to the total number of items of information provided or notices submitted within a given time frame;

(c)

the gravity of the misuses, including the nature of illegal_content, and of its consequences;

(d)

where it is possible to identify it, the intention of the recipient_of_the_service, the individual, the entity or the complainant.

4.   Providers of online_platforms shall set out, in a clear and detailed manner, in their terms_and_conditions their policy in respect of the misuse referred to in paragraphs 1 and 2, and shall give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension.
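
Paragraph 3 names both absolute numbers (point (a)) and the relative proportion (point (b)) among the circumstances to be weighed. The screening rule sketched below combines only those two quantitative factors, with invented thresholds; because the Article requires a timely, diligent, case-by-case assessment that also covers gravity and intent, such a score could at most flag candidates for human review, never decide a suspension by itself.

def misuse_signal(unfounded, total_submitted, min_count=20, min_share=0.5):
    # Thresholds are illustrative placeholders, not values from the Regulation.
    if total_submitted == 0:
        return False
    return unfounded >= min_count and unfounded / total_submitted >= min_share

print(misuse_signal(25, 30))    # True: frequent in absolute and relative terms
print(misuse_signal(25, 5000))  # False: frequent in absolute terms only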

Article 24

Transparency reporting obligations for providers of online_platforms

1.   In addition to the information referred to in Article 15, providers of online_platforms shall include in the reports referred to in that Article information on the following:

(a)

the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online_platform implemented the decisions of the body;

(b)

the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal_content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

2.   By 17 February 2023 and at least once every six months thereafter, providers shall publish for each online_platform or online_search_engine, in a publicly available section of their online_interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted.

3.   Providers of online_platforms or of online_search_engines shall communicate to the Digital_Services_Coordinator_of_establishment and the Commission, upon their request and without undue delay, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online_platform or of the online_search_engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data.

4.   When the Digital_Services_Coordinator_of_establishment has reasons to consider, based on the information received pursuant to paragraphs 2 and 3 of this Article, that a provider of online_platforms or of online_search_engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it shall inform the Commission thereof.

5.   Providers of online_platforms shall, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online_platforms shall ensure that the information submitted does not contain personal data.

6.   The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1 of this Article. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88.
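
The paragraph 2 figure is a six-month average of monthly active recipients in the Union, and under paragraph 4 the result is compared against the Article 33(1) threshold of 45 million. A minimal sketch of those two steps; how an 'active recipient' is counted and de-duplicated is left to the Article 33(3) delegated acts, so everything upstream of the average is an assumption here.

VLOP_THRESHOLD = 45_000_000  # Article 33(1): average monthly active recipients

def average_monthly_active_recipients(last_six_months):
    assert len(last_six_months) == 6, "average over the period of the past six months"
    return sum(last_six_months) / 6

monthly = [41_000_000, 43_500_000, 44_800_000, 46_200_000, 47_100_000, 48_000_000]
avg = average_monthly_active_recipients(monthly)
print(f"{avg:,.0f}", avg >= VLOP_THRESHOLD)  # 45,100,000 True: inform the Commission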

Article 27

Recommender system transparency

1.   Providers of online_platforms that use recommender_systems shall set out in their terms_and_conditions, in plain and intelligible language, the main parameters used in their recommender_systems, as well as any options for the recipients of the service to modify or influence those main parameters.

2.   The main parameters referred to in paragraph 1 shall explain why certain information is suggested to the recipient_of_the_service. They shall include, at least:

(a)

the criteria which are most significant in determining the information suggested to the recipient_of_the_service;

(b)

the reasons for the relative importance of those parameters.

3.   Where several options are available pursuant to paragraph 1 for recommender_systems that determine the relative order of information presented to recipients of the service, providers of online_platforms shall also make available a functionality that allows the recipient_of_the_service to select and to modify at any time their preferred option. That functionality shall be directly and easily accessible from the specific section of the online_platform’s online_interface where the information is being prioritised.
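
The paragraph 2 disclosure is small and enumerable: the most significant criteria, the reasons for their relative importance and, under paragraph 3, any selectable options. A hypothetical rendering follows; the Article requires plain and intelligible language in the terms_and_conditions, not a data structure, and the example parameters are invented.

main_parameters = [
    {"criterion": "predicted relevance to your past interactions",       # paragraph 2(a)
     "why_this_weight": "primary ranking signal for the default feed"},  # paragraph 2(b)
    {"criterion": "recency of the post",
     "why_this_weight": "tie-breaker so the feed stays current"},
]

# Paragraph 3: where several options exist, recipients must be able to select
# and modify their preferred one at any time, from the section where the
# ranking is applied.
selectable_options = ["personalised", "chronological"]
user_choice = "chronological"
assert user_choice in selectable_options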

Article 34

Risk assessment

1.   Providers of very large online_platforms and of very large online_search_engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.

They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

(a)

the dissemination of illegal_content through their services;

(b)

any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;

(c)

any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

(d)

any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.

2.   When conducting risk assessments, providers of very large online_platforms and of very large online_search_engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:

(a)

the design of their recommender_systems and any other relevant algorithmic system;

(b)

their content_moderation systems;

(c)

the applicable terms_and_conditions and their enforcement;

(d)

systems for selecting and presenting advertisements;

(e)

data related practices of the provider.

The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal_content and of information that is incompatible with their terms_and_conditions.

The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

3.   Providers of very large online_platforms and of very large online_search_engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital_Services_Coordinator_of_establishment.
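
Paragraph 1 requires the assessment to be proportionate to the systemic risks, 'taking into consideration their severity and probability'. A conventional severity-by-probability matrix is one hypothetical way to record that judgment per risk category; the Regulation prescribes no scoring method, and the scores below are placeholders, not findings.

RISK_CATEGORIES = {
    "1(a)": "dissemination of illegal content",
    "1(b)": "negative effects on fundamental rights",
    "1(c)": "civic discourse, electoral processes, public security",
    "1(d)": "gender-based violence, public health, minors, well-being",
}

def risk_score(severity, probability):
    # Both on an assumed 1-5 scale; the product gives a simple ordering of
    # risks for the Article 35 mitigation work.
    assert 1 <= severity <= 5 and 1 <= probability <= 5
    return severity * probability

assessment = {
    "1(a)": risk_score(4, 3),  # placeholder judgments, not findings
    "1(b)": risk_score(3, 3),
    "1(c)": risk_score(5, 2),
    "1(d)": risk_score(4, 2),
}
print(max(assessment, key=assessment.get))  # 1(a): highest-priority risk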

Article 35

Mitigation of risks

1.   Providers of very large online_platforms and of very large online_search_engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:

(a)

adapting the design, features or functioning of their services, including their online_interfaces;

(b)

adapting their terms_and_conditions and their enforcement;

(c)

adapting content_moderation processes, including the speed and quality of processing notices related to specific types of illegal_content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content_moderation;

(d)

testing and adapting their algorithmic systems, including their recommender_systems;

(e)

adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;

(f)

reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;

(g)

initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;

(h)

initiating or adjusting cooperation with other providers of online_platforms or of online_search_engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;

(i)

taking awareness-raising measures and adapting their online_interface in order to give recipients of the service more information;

(j)

taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;

(k)

ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online_interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.

2.   The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year. The reports shall include the following:

(a)

identification and assessment of the most prominent and recurrent systemic risks reported by providers of very large online_platforms and of very large online_search_engines or identified through other information sources, in particular those provided in compliance with Articles 39, 40 and 42;

(b)

best practices for providers of very large online_platforms and of very large online_search_engines to mitigate the systemic risks identified.

Those reports shall present systemic risks broken down by the Member States in which they occurred and in the Union as a whole, as applicable.

3.   The Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations.

Article 37

Independent audit

1.   Providers of very large online_platforms and of very large online_search_engines shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:

(a)

the obligations set out in Chapter III;

(b)

any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48.

2.   Providers of very large online_platforms and of very large online_search_engines shall afford the organisations carrying out the audits pursuant to this Article the cooperation and assistance necessary to enable them to conduct those audits in an effective, efficient and timely manner, including by giving them access to all relevant data and premises and by answering oral or written questions. They shall refrain from hampering, unduly influencing or undermining the performance of the audit.

Such audits shall ensure an adequate level of confidentiality and professional secrecy in respect of the information obtained from the providers of very large online_platforms and of very large online_search_engines and third parties in the context of the audits, including after the termination of the audits. However, complying with that requirement shall not adversely affect the performance of the audits and other provisions of this Regulation, in particular those on transparency, supervision and enforcement. Where necessary for the purpose of the transparency reporting pursuant to Article 42(4), the audit report and the audit implementation report referred to in paragraphs 4 and 6 of this Article shall be accompanied with versions that do not contain any information that could reasonably be considered to be confidential.

3.   Audits performed pursuant to paragraph 1 shall be performed by organisations which:

(a)

are independent from, and do not have any conflicts of interest with, the provider of very large online_platforms or of very large online_search_engines concerned and any legal person connected to that provider; in particular:

(i)

have not provided non-audit services related to the matters audited to the provider of very large online_platform or of very large online_search_engine concerned and to any legal person connected to that provider in the 12 months’ period before the beginning of the audit and have committed to not providing them with such services in the 12 months’ period after the completion of the audit;

(ii)

have not provided auditing services pursuant to this Article to the provider of very large online_platform or of very large online_search_engine concerned and any legal person connected to that provider during a period longer than 10 consecutive years;

(iii)

are not performing the audit in return for fees which are contingent on the result of the audit;

(b)

have proven expertise in the area of risk management, technical competence and capabilities;

(c)

have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.

4.   Providers of very large online_platforms and of very large online_search_engines shall ensure that the organisations that perform the audits establish an audit report for each audit. That report shall be substantiated, in writing, and shall include at least the following:

(a)

the name, address and the point of contact of the provider of the very large online_platform or of the very large online_search_engine subject to the audit and the period covered;

(b)

the name and address of the organisation or organisations performing the audit;

(c)

a declaration of interests;

(d)

a description of the specific elements audited, and the methodology applied;

(e)

a description and a summary of the main findings drawn from the audit;

(f)

a list of the third parties consulted as part of the audit;

(g)

an audit opinion on whether the provider of the very large online_platform or of the very large online_search_engine subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, namely ‘positive’, ‘positive with comments’ or ‘negative’;

(h)

where the audit opinion is not ‘positive’, operational recommendations on specific measures to achieve compliance and the recommended timeframe to achieve compliance.

5.   Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited.

6.   Providers of very large online_platforms or of very large online_search_engines receiving an audit report that is not ‘positive’ shall take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified.

7.   The Commission is empowered to adopt delegated acts in accordance with Article 87 to supplement this Regulation by laying down the necessary rules for the performance of the audits pursuant to this Article, in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to this Article. Those delegated acts shall take into account any voluntary auditing standards referred to in Article 44(1), point (e).
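
Paragraph 4 fixes the report's minimum content and paragraph 6 attaches a one-month deadline whenever the point (g) opinion is not 'positive'. The hypothetical record type below captures both rules; the field names are this note's invention, not the Regulation's.

from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum

class Opinion(Enum):                 # paragraph 4, point (g)
    POSITIVE = "positive"
    POSITIVE_WITH_COMMENTS = "positive with comments"
    NEGATIVE = "negative"

@dataclass
class AuditReport:
    provider: str                    # point (a)
    auditor: str                     # point (b)
    period_covered: str              # point (a)
    opinion: Opinion                 # point (g)
    recommendations: list            # point (h): required when not 'positive'
    received_on: date

    def implementation_report_due(self):
        # Paragraph 6: implementation report within one month of receiving the
        # recommendations (one month approximated as 30 days for illustration);
        # no such deadline applies to a 'positive' opinion.
        if self.opinion is Opinion.POSITIVE:
            return None
        return self.received_on + timedelta(days=30)

report = AuditReport("Provider P", "Auditor Q", "2024", Opinion.NEGATIVE,
                     ["re-staff content moderation"], date(2025, 1, 10))
print(report.implementation_report_due())  # 2025-02-09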

Article 39

Additional online advertising transparency

1.   Providers of very large online_platforms or of very large online_search_engines that present advertisements on their online_interfaces shall compile and make publicly available in a specific section of their online_interface, through a searchable and reliable tool that allows multicriteria queries and through application programming interfaces, a repository containing the information referred to in paragraph 2, for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online_interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete.

2.   The repository shall include at least all of the following information:

(a)

the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;

(b)

the natural or legal person on whose behalf the advertisement is presented;

(c)

the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);

(d)

the period during which the advertisement was presented;

(e)

whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;

(f)

the commercial_communications published on the very large online_platforms and identified pursuant to Article 26(2);

(g)

the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted.

3.   As regards paragraph 2, points (a), (b) and (c), where a provider of very large online_platform or of very large online_search_engine has removed or disabled access to a specific advertisement based on alleged illegality or incompatibility with its terms_and_conditions, the repository shall not include the information referred to in those points. In such case, the repository shall include, for the specific advertisement concerned, the information referred to in Article 17(3), points (a) to (e), or Article 9(2), point (a)(i), as applicable.

The Commission may, after consultation of the Board, the relevant vetted researchers referred to in Article 40 and the public, issue guidelines on the structure, organisation and functionalities of the repositories referred to in this Article.
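
Paragraph 1 requires the repository to be exposed 'through a searchable and reliable tool that allows multicriteria queries' and through APIs, covering at least the paragraph 2 fields. A toy in-memory sketch of such a query follows; the record shape and field names are assumptions for illustration only.

ads = [
    {"content": "Spring sale at ACME shoes",                # point (a)
     "on_behalf_of": "ACME Ltd",                            # point (b)
     "paid_by": "ACME Ltd",                                 # point (c)
     "shown_from": "2024-03-01", "shown_to": "2024-03-31",  # point (d)
     "targeted": True,                                      # point (e)
     "targeting_parameters": ["age 18-34", "interest: running"],
     "reach_total": 1_200_000},                             # point (g)
]

def query(records, **criteria):
    # Multicriteria AND: return records matching every supplied field exactly.
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

print(query(ads, on_behalf_of="ACME Ltd", targeted=True))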

Article 42

Transparency reporting obligations

1.   Providers of very large online_platforms or of very large online_search_engines shall publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months.

2.   The reports referred to in paragraph 1 of this Article published by providers of very large online_platforms shall, in addition to the information referred to in Article 15 and Article 24(1), specify:

(a)

the human resources that the provider of very large online_platforms dedicates to content_moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

(b)

the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

(c)

the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States.

The reports shall be published in at least one of the official languages of the Member States.

3.   In addition to the information referred to in Article 24(2), the providers of very large online_platforms or of very large online_search_engines shall include in the reports referred to in paragraph 1 of this Article the information on the average monthly recipients of the service for each Member State.

4.   Providers of very large online_platforms or of very large online_search_engines shall transmit to the Digital_Services_Coordinator_of_establishment and the Commission, without undue delay upon completion, and make publicly available at the latest three months after the receipt of each audit report pursuant to Article 37(4):

(a)

a report setting out the results of the risk assessment pursuant to Article 34;

(b)

the specific mitigation measures put in place pursuant to Article 35(1);

(c)

the audit report provided for in Article 37(4);

(d)

the audit implementation report provided for in Article 37(6);

(e)

where applicable, information about the consultations conducted by the provider in support of the risk assessments and design of the risk mitigation measures.

5.   Where a provider of very large online_platform or of very large online_search_engine considers that the publication of information pursuant to paragraph 4 might result in the disclosure of confidential information of that provider or of the recipients of the service, cause significant vulnerabilities for the security of its service, undermine public security or harm recipients, the provider may remove such information from the publicly available reports. In that case, the provider shall transmit the complete reports to the Digital_Services_Coordinator_of_establishment and the Commission, accompanied by a statement of the reasons for removing the information from the publicly available reports.

Article 48

Crisis protocols

1.   The Board may recommend that the Commission initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations. Those situations shall be strictly limited to extraordinary circumstances affecting public security or public health.

2.   The Commission shall encourage and facilitate the providers of very large online_platforms, of very large online_search_engines and, where appropriate, the providers of other online_platforms or of other online_search_engines, to participate in the drawing up, testing and application of those crisis protocols. The Commission shall aim to ensure that those crisis protocols include one or more of the following measures:

(a)

prominently displaying information on the crisis situation provided by Member States’ authorities or at Union level, or, depending on the context of the crisis, by other relevant reliable bodies;

(b)

ensuring that the provider of intermediary_services designates a specific point of contact for crisis management; where relevant, this may be the electronic point of contact referred to in Article 11 or, in the case of providers of very large online_platforms or of very large online_search_engines, the compliance officer referred to in Article 41;

(c)

where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 16, 20, 22, 23 and 35 to the needs arising from the crisis situation.

3.   The Commission shall, as appropriate, involve Member States’ authorities, and may also involve Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.

4.   The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:

(a)

the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;

(b)

the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;

(c)

a clear procedure for determining when the crisis protocol is to be activated;

(d)

a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;

(e)

safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;

(f)

a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation.

5.   If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in paragraph 4, point (e), it shall request the participants to revise the crisis protocol, including by taking additional measures.

CHAPTER IV

IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT

SECTION 1

Competent authorities and national Digital Services Coordinators

Article 55

Activity reports

1.   Digital Services Coordinators shall draw up annual reports on their activities under this Regulation, including the number of complaints received pursuant to Article 53 and an overview of their follow-up. The Digital Services Coordinators shall make the annual reports available to the public in a machine-readable format, subject to the applicable rules on the confidentiality of information pursuant to Article 84, and shall communicate them to the Commission and to the Board.

2.   The annual report shall also include the following information:

(a)

the number and subject matter of orders to act against illegal_content and orders to provide information issued in accordance with Articles 9 and 10 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned;

(b)

the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 9 and 10.

3.   Where a Member State has designated several competent authorities pursuant to Article 49, it shall ensure that the Digital Services Coordinator draws up a single report covering the activities of all competent authorities and that the Digital Services Coordinator receives all relevant information and support needed to that effect from the other competent authorities concerned.

SECTION 2

Competences, coordinated investigation and consistency mechanisms

Article 57

Mutual assistance

1.   Digital Services Coordinators and the Commission shall cooperate closely and provide each other with mutual assistance in order to apply this Regulation in a consistent and efficient manner. Mutual assistance shall include, in particular, exchange of information in accordance with this Article and the duty of the Digital_Services_Coordinator_of_establishment to inform all Digital Services Coordinators of destination, the Board and the Commission about the opening of an investigation and the intention to take a final decision, including its assessment, in respect of a specific provider of intermediary_services.

2.   For the purpose of an investigation, the Digital_Services_Coordinator_of_establishment may request other Digital Services Coordinators to provide specific information in their possession as regards a specific provider of intermediary_services or to exercise their investigative powers referred to in Article 51(1) with regard to specific information located in their Member State. Where appropriate, the Digital Services Coordinator receiving the request may involve other competent authorities or other public authorities of the Member State in question.

3.   The Digital Services Coordinator receiving the request pursuant to paragraph 2 shall comply with such request and inform the Digital_Services_Coordinator_of_establishment about the action taken, without undue delay and no later than two months after its receipt, unless:

(a)

the scope or the subject matter of the request is not sufficiently specified, justified or proportionate in view of the investigative purposes; or

(b)

neither the requested Digital Services Coordinator nor other competent authority or other public authority of that Member State is in possession of the requested information nor can have access to it; or

(c)

the request cannot be complied with without infringing Union or national law.

The Digital Services Coordinator receiving the request shall justify its refusal by submitting a reasoned reply, within the period set out in the first subparagraph.

Article 59

Referral to the Commission

1.   In the absence of a communication within the period laid down in Article 58(5), in the case of a disagreement of the Board with the assessment or the measures taken or envisaged pursuant to Article 58(5) or in the cases referred to in Article 60(3), the Board may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital_Services_Coordinator_of_establishment, the assessment by that Digital Services Coordinator, the reasons for the disagreement and any additional information supporting the referral.

2.   The Commission shall assess the matter within two months following the referral of the matter pursuant to paragraph 1, after having consulted the Digital_Services_Coordinator_of_establishment.

3.   Where, pursuant to paragraph 2 of this Article, the Commission considers that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to Article 58(5) are insufficient to ensure effective enforcement or otherwise incompatible with this Regulation, it shall communicate its views to the Digital_Services_Coordinator_of_establishment and the Board and request the Digital_Services_Coordinator_of_establishment to review the matter.

The Digital_Services_Coordinator_of_establishment shall take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, taking utmost account of the views and request for review by the Commission. The Digital_Services_Coordinator_of_establishment shall inform the Commission, as well as the requesting Digital Services Coordinator or the Board that took action pursuant to Article 58(1) or (2), about the measures taken within two months from that request for review.

Article 72

Monitoring actions

1.   For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation of and compliance with this Regulation by providers of the very large online_platform and of the very large online_search_engine concerned. The Commission may order them to provide access to, and explanations relating to, their databases and algorithms. Such actions may include imposing an obligation on the provider of the very large online_platform or of the very large online_search_engine to retain all documents deemed to be necessary to assess the implementation of and compliance with the obligations under this Regulation.

2.   The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors, as well as experts and auditors from competent national authorities with the agreement of the authority concerned, to assist the Commission in monitoring the effective implementation and compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Commission.

Article 75

Enhanced supervision of remedies to address infringements of obligations laid down in Section 5 of Chapter III

1.   When adopting a decision pursuant to Article 73 in relation to an infringement by a provider of a very large online_platform or of a very large online_search_engine of any of the provisions of Section 5 of Chapter III, the Commission shall make use of the enhanced supervision system laid down in this Article. When doing so, it shall take utmost account of any opinion of the Board pursuant to this Article.

2.   In the decision referred to in Article 73, the Commission shall require the provider of a very large online_platform or of a very large online_search_engine concerned to draw up and communicate, within a reasonable period specified in the decision, to the Digital Services Coordinators, the Commission and the Board an action plan setting out the necessary measures which are sufficient to terminate or remedy the infringement. Those measures shall include a commitment to perform an independent audit in accordance with Article 37(3) and (4) on the implementation of the other measures, and shall specify the identity of the auditors, as well as the methodology, timing and follow-up of the audit. The measures may also include, where appropriate, a commitment to participate in a relevant code of conduct, as provided for in Article 45.

3.   Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the Commission. Within one month following receipt of that opinion, the Commission shall decide whether the measures set out in the action plan are sufficient to terminate or remedy the infringement, and shall set a reasonable period for its implementation. The possible commitment to adhere to relevant codes of conduct shall be taken into account in that decision. The Commission shall subsequently monitor the implementation of the action plan. To that end, the provider of a very large online_platform or of a very large online_search_engine concerned shall communicate the audit report to the Commission without undue delay after it becomes available, and shall keep the Commission up to date on steps taken to implement the action plan. The Commission may, where necessary for such monitoring, require the provider of a very large online_platform or of a very large online_search_engine concerned to provide additional information within a reasonable period set by the Commission.

The Commission shall keep the Board and the Digital Services Coordinators informed about the implementation of the action plan, and about its monitoring thereof.

4.   The Commission may take necessary measures in accordance with this Regulation, in particular Article 76(1), point (e), and Article 82(1), where:

(a)

the provider of the very large online_platform or of the very large online_search_engine concerned fails to provide any action plan, the audit report, the necessary updates or any additional information required, within the applicable period;

(b)

the Commission rejects the proposed action plan because it considers that the measures set out therein are insufficient to terminate or remedy the infringement; or

(c)

the Commission considers, on the basis of the audit report, any updates or additional information provided or any other relevant information available to it, that the implementation of the action plan is insufficient to terminate or remedy the infringement.
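Taken together, paragraphs 2 to 4 describe a supervision cycle with fixed checkpoints: an action plan within the period set in the decision, a Board opinion within one month of its receipt, a Commission decision within a further month, then ongoing monitoring, with escalation under paragraph 4 possible at any point. A rough state-machine sketch, assuming hypothetical stage labels that are not terms of the Regulation:

from enum import Enum, auto

class Stage(Enum):                  # hypothetical labels, not DSA terms
    DECISION_ADOPTED = auto()       # Article 73 decision (paragraph 1)
    ACTION_PLAN_SUBMITTED = auto()  # within the period set in the decision
    BOARD_OPINION = auto()          # one month from receipt of the plan
    COMMISSION_DECISION = auto()    # one month from receipt of the opinion
    MONITORING = auto()             # audit report and implementation updates
    ESCALATION = auto()             # paragraph 4 -> Art. 76(1)(e), Art. 82(1)

_ORDER = [Stage.DECISION_ADOPTED, Stage.ACTION_PLAN_SUBMITTED,
          Stage.BOARD_OPINION, Stage.COMMISSION_DECISION, Stage.MONITORING]

def next_stage(stage: Stage, plan_missing: bool = False,
               plan_rejected: bool = False,
               implementation_insufficient: bool = False) -> Stage:
    # Grounds (a), (b) and (c) of paragraph 4 each divert the cycle
    # into enforcement; otherwise it advances one checkpoint.
    if plan_missing or plan_rejected or implementation_insufficient:
        return Stage.ESCALATION
    if stage in (Stage.MONITORING, Stage.ESCALATION):
        return stage                # monitoring continues until closed
    return _ORDER[_ORDER.index(stage) + 1]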

Article 77

Limitation period for the imposition of penalties

1.   The powers conferred on the Commission by Articles 74 and 76 shall be subject to a limitation period of five years.

2.   Time shall begin to run on the day on which the infringement is committed. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases.

3.   Any action taken by the Commission or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following:

(a)

requests for information by the Commission or by a Digital Services Coordinator;

(b)

inspection;

(c)

the opening of a proceeding by the Commission pursuant to Article 66(1).

4.   Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period has been suspended pursuant to paragraph 5.

5.   The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Commission is the subject of proceedings pending before the Court of Justice of the European Union.
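Paragraphs 1 to 5 reduce to a piece of date arithmetic: a five-year clock starting when the infringement ceases, restarted afresh by each interruption, capped at twice the limitation period counted from the original start, with both the running clock and the cap pushed out by any time spent suspended before the Court. A simplified sketch of that computation follows; the helper names and day-level precision are assumptions for illustration, not provisions of the Regulation.

from datetime import date, timedelta

LIMITATION_YEARS = 5  # Article 77(1)

def _plus_years(d: date, years: int) -> date:
    try:
        return d.replace(year=d.year + years)
    except ValueError:              # 29 February in a non-leap target year
        return d.replace(year=d.year + years, day=28)

def latest_penalty_date(infringement_ceased: date,
                        interruptions: list[date],
                        suspension_days: int = 0) -> date:
    # The clock restarts at the most recent interruption, if any
    # (paragraphs 3 and 4, first sentence); suspension under paragraph 5
    # pauses it, which on this reading pushes expiry out day for day.
    last_restart = max([infringement_ceased, *interruptions])
    ordinary = (_plus_years(last_restart, LIMITATION_YEARS)
                + timedelta(days=suspension_days))
    # Absolute cap: twice the limitation period from the original start,
    # likewise extended by the suspension (paragraph 4, second and third
    # sentences).
    cap = (_plus_years(infringement_ceased, 2 * LIMITATION_YEARS)
           + timedelta(days=suspension_days))
    return min(ordinary, cap)

For instance, on this reading an infringement ceasing on 1 March 2025 and interrupted by a request for information on 1 March 2029 would remain open until 1 March 2034; a further interruption on 1 March 2034 would then run into the 1 March 2035 cap rather than gaining another five years.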

Article 86

Representation

1.   Without prejudice to Directive (EU) 2020/1828 or to any other type of representation under national law, recipients of intermediary_services shall at least have the right to mandate a body, organisation or association to exercise the rights conferred by this Regulation on their behalf, provided the body, organisation or association meets all of the following conditions:

(a)

it operates on a not-for-profit basis;

(b)

it has been properly constituted in accordance with the law of a Member State;

(c)

its statutory objectives include a legitimate interest in ensuring that this Regulation is complied with.

2.   Providers of online_platforms shall take the necessary technical and organisational measures to ensure that complaints submitted by bodies, organisations or associations referred to in paragraph 1 of this Article on behalf of recipients of the service through the mechanisms referred to in Article 20(1) are processed and decided upon with priority and without undue delay.
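Since paragraph 1 requires a body to meet all of the listed conditions, checking whether it may be mandated is a simple conjunction. A minimal illustrative check, with hypothetical field names not taken from the Regulation:

from dataclasses import dataclass

@dataclass
class RepresentativeBody:                        # hypothetical record
    not_for_profit: bool                         # condition (a)
    constituted_under_member_state_law: bool     # condition (b)
    statutory_objectives_cover_compliance: bool  # condition (c)

def may_be_mandated(body: RepresentativeBody) -> bool:
    # Article 86(1): the three conditions are cumulative.
    return (body.not_for_profit
            and body.constituted_under_member_state_law
            and body.statutory_objectives_cover_compliance)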

SECTION 6

Delegated and implementing acts

