Digital Services Act
On July 5, 2022, the EU regulation known as the Digital Services Act (DSA) was approved.

The aim of this Act, like that of the Digital Markets Act (DMA), is to establish the rules that will govern the future European digital single market.

In this way, the fundamental rights of Internet users, as well as those of the users of digital services, will be protected.

Another objective of this new law is to promote growth, innovation and competitiveness in the digital market, while allowing smaller platforms to expand.

Finally, the DSA aims to control and supervise large platforms (those with a significant market share in their sector).

Who does the Digital Services Act (DSA) apply to?

It will apply to all information society service providers that, regardless of their location, provide services to users residing or established in the EU.

In particular, the DSA applies to providers of intermediary services: mere transmission services, caching services and hosting services, as well as search engines.

Types of service providers

  • Mere transmission service: Services that consist of transmitting information provided by the user, or of providing access to a communication network, such as network and infrastructure operators.
  • Caching service: Consists of the automatic, intermediate and temporary storage of information, with the purpose of making its onward transmission to the recipients of the service more efficient.
  • Hosting service: The storage of information provided by a user of the service. An example of this is iCloud.
  • Online platform service: Provision of social networking services or marketplaces where recipients can take different actions: Amazon, eBay, AliExpress…
  • Very large online platform service: It differs from the previous category in the number of monthly active recipients of the service in the EU, which must be equal to or greater than 45 million. Platforms may also be designated as very large by Commission decision. Netflix, Microsoft Office 365…
  • Search engine service: Provision of services whose purpose is to offer recipients a system for searching the web, so that they can run queries. Google, Yahoo…
  • Very large online search engine service: It differs from the previous category in the number of monthly active recipients in the EU, which must be equal to or greater than 45 million. Search engines may also be designated as very large by Commission decision (see the sketch after this list).
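
As a rough, purely illustrative sketch of how the 45 million threshold separates the "very large" categories from ordinary platforms and search engines, the snippet below classifies a service from its monthly number of active recipients in the EU. The names, figures and the Service/classify helpers are assumptions made for the example, and the sketch deliberately ignores the Commission's power to designate a service as very large by decision.

    from dataclasses import dataclass

    # Threshold for "very large" status: 45 million monthly active recipients in the EU.
    VLOP_THRESHOLD = 45_000_000

    @dataclass
    class Service:
        name: str
        kind: str  # "platform" or "search_engine"
        monthly_active_eu_recipients: int

    def classify(service: Service) -> str:
        """Label a service as 'very large' when it meets the 45 million threshold
        (leaving aside designation by Commission decision)."""
        very_large = service.monthly_active_eu_recipients >= VLOP_THRESHOLD
        if service.kind == "platform":
            return "very large online platform" if very_large else "online platform"
        return "very large online search engine" if very_large else "online search engine"

    # Illustrative figures only, not real usage data.
    print(classify(Service("ExampleMarket", "platform", 50_000_000)))
    # -> very large online platform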

 

Interpersonal communication services are, as a general rule, excluded from the scope of application of the Digital Services Act (DSA).

However, the Digital Services Act (DSA) may apply to interpersonal communication services which, because of their nature, can reach a potentially indeterminate or unlimited number of users.

This is the case of public groups or open channels, such as those found on online messaging platforms.

 

Obligations for Intermediary Service Providers

The purpose of these obligations is none other than to improve the transparency and accountability of intermediary service providers.

In this way, the protection of users will be increased.

 

Obligations are classified into:

General obligations:

  • Cooperation with authorities. When they receive an order from the authorities to act against illegal content or to provide information, they must:
    • Inform the authority about the execution of the order.
    • Inform the user about the order received, indicating the reasons behind the decision, as well as the possibilities of recourse (if any).
  • Designation of a legal representative in the EU. If the service provider is not domiciled in the EU, it must designate a legal representative in a Member State where it offers its service.
  • Clear and concise terms of service. They must use clear, simple and accessible language and include information on the restrictions on the use of the service (the content moderation procedure).
  • Single points of contact. Single points of contact must be established for authorities and for service users, enabling them to communicate with the provider easily, directly and quickly.
  • Information transparency on content moderation. Intermediary service providers must publish, at least annually, clear, accessible and legible reports on the content moderation activity carried out, indicating the number of orders and complaints received, the Member State from which each order was issued, the use of automated tools and, finally, the time taken to inform the authority of their receipt and execution.

 

Obligations for online platforms:

Notification and action:

Easily accessible mechanisms must be established so that users can report the presence of illegal content on the service.

Explanation of reasons:

The provider is obliged to inform the affected recipient whenever it takes a decision to restrict content on its service, provided that the recipient's contact details are known. It must explain the reasons for the decision, the measures applied and the possibilities of recourse.

Notification of suspected criminal conduct:

If the provider suspects a criminal offence that may constitute a threat to the life or safety of individuals, it shall notify the police or judicial authorities of the Member State where the offence is committed.

Informative transparency:

In addition to the reports required of all intermediary service providers, online platforms must indicate how many disputes have been submitted for resolution and their outcome, the number of suspensions imposed on their users…

Internal complaint-handling system:

The online platform must establish an internal system for receiving and processing complaints lodged by users affected by a decision taken by the provider.

Extrajudicial resolution of conflicts:

The affected user may turn to any independent out-of-court dispute resolution body certified by the digital services coordinator of the corresponding Member State.

Handling of notices from trusted flaggers:

There will be trusted flaggers, who will be able to submit notices of illegal content to online platforms; these notices must be handled with priority.

Protection against misuse (suspension):

Online platforms will have the power to suspend the service of any user who frequently provides illegal content.

Advertising:

Online platforms must provide greater transparency about the advertising they display or that is published by their users. Advertisements must be clearly identifiable as such. Platforms may not target minors with advertising based on profiling.

Recommendation systems:

Platforms must provide a clear and simple explanation of the criteria used to determine the content suggested to users. Users, in turn, will be able to select how information is prioritized, through a direct and accessible functionality.

Obligations of online platforms that allow consumers to contact merchants:

Where the online platform allows consumers to contact merchants (for example, marketplaces), it must comply with the following additional obligations in order to protect the consumer:

  • Traceability: The provider must request, verify and publish complete and reliable information about the merchant.
  • Compliance by design: The platform must be designed so that merchants can provide information about their products, services, trademarks and identifying contact details.
  • Information to consumers: If the provider becomes aware that a product or service offered on its platform is unlawful, it must inform the consumers who purchased it during the previous six months, provided it has their contact details. If it does not, it must publish a general notice on its platform.

 

Obligations for very large online platforms and search engines:

Risk evaluation and reduction:

This evaluation shall be performed at least annually, and also whenever new functionalities are deployed that may affect the risks identified (for example, the dissemination of unlawful content). When such risks are detected, reduction or mitigation measures shall be implemented (for example, specific measures to protect the fundamental rights of minors).

Crisis management:

Where extraordinary circumstances arise that create serious threats to public health or safety, the Commission may require very large online platforms and search engines to take concrete measures to assess the impact on their services and procedures and adapt them, with the aim of managing the crisis.

Submission to independent annual audits:

Independent audits must be carried out every year, and the resulting recommendations must be acted upon: within one month, the provider must issue a report setting out the measures it has adopted in response to the auditor's recommendations.

Access to data by authorities:

The necessary data shall be provided to allow competent authorities to supervise and evaluate compliance. 

Verification of compliance:

An internal compliance function must be created, independent and separate from the provider's operational functions, to verify compliance with the DSA.

Supervisory fee:

These fees will be charged annually by the Commission to very large online platforms and search engines. Their purpose is to cover the costs the Commission incurs in connection with its supervisory functions.

Advertising:

They shall be required to maintain a public repository of the advertisements shown on the online platform. This information must remain available for at least one year after the advertisement was published.

Informative transparency:

Transparency reports should include information on the average number of active recipients of the online service in each Member State, as well as the results of their risk assessments and audits. The content moderation procedures established under the DSA should also be described.

Terms of service:

The terms of service must be published in all the official languages of the Member States in which they provide their services, together with a clear, simple and unambiguous summary.

Infringements and sanctions of the Digital Services Act (DSA)

In the event of an infringement of the above obligations, intermediary service providers may be subject to severe penalties.

They may also be prohibited from operating in the EU single market in the case of serious and repeated infringements.

The Digital Services Act (DSA) does not contain a specific sanctioning regime. This will be the responsibility of each Member State.

Sanctions for non-compliance with the obligations contained in the DSA will have serious economic consequences for service providers.

This is because fines can reach a maximum of 6% of the service provider's total annual turnover in the previous financial year.
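
As a purely illustrative sketch of how this 6% cap works, the snippet below computes the theoretical maximum fine from an assumed annual turnover figure; the function name and the turnover amount are assumptions made for the example, not values taken from the DSA.

    def max_dsa_fine(annual_turnover_eur: float) -> float:
        """Theoretical upper bound of a DSA fine: 6% of the provider's total
        annual turnover in the previous financial year."""
        return annual_turnover_eur * 0.06

    # Illustrative only: a provider with EUR 10 billion in annual turnover
    # could face a fine of up to EUR 600 million.
    print(max_dsa_fine(10_000_000_000))  # 600000000.0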

For their part, service users will be able to file complaints against the provider with the digital services coordinator of the competent Member State.

They may also request compensation from any service provider who has caused loss or damage due to non-compliance.

 The power to impose penalties will be assigned to the digital services coordinator of each Member State.

This coordinating entity will be responsible for supervising intermediary service providers, monitoring compliance with the DSA and imposing sanctions.

It may also act on its own or in cooperation with other competent authorities.

 

Entry into force

The regulation will enter into force 20 days after its publication in the Official Journal.

The DSA will apply across the entire European Union fifteen months after its entry into force or from January 1, 2024, whichever is later.

From this moment, all service platforms will have to act in accordance with the Digital Services Act (DSA).

They must respect any and all limits imposed, as well as comply with the obligations established.

 

At Auratech, we are at your disposal.

 
