3. DEVELOPMENT OF DIGITAL SERVICES
A FEW BIG ONES, MANY SMALL ONES
EU Commission via Dealroom database:
approx. 10,000 digital services are based in the EU, of which 96% are SMEs and around half are micro enterprises.
Dealroom database, infra, p. 24
4. DEVELOPMENT OF DIGITAL SERVICES
E-Commerce Directive (2000):
Rules for all services of the
information society
Digital Services Act:
Liability rules (= ECD)
Notice and complaint mechanisms
Due diligence obligations
Harmonised enforcement (Digital Services Coordinators + EU Commission)
5. GOALS
1. To make it easier to offer cross-border digital
services and to create uniform rules for the
European internal market.
2. Make the internet and digital services a
safer place where national law applies and
everyone in the EU can freely exercise
their fundamental rights, in particular
their right to freedom of expression and
information.
3. Combat disinformation and hate
speech. The ways in which platforms can
damage democratic societies are being
addressed.
6. SCOPE
The DSA applies to digital services offered to users located
in the EU, regardless of the place of establishment of the
services themselves.
It applies with tiered rules to
intermediary services
hosting services
online platforms
Very Large Online Platforms (VLOPs) and Search Engines
(VLOSEs)
7. SCOPE
Intermediary services
Hosting services
Online platforms
Very large online platforms and search engines (VLOPs and VLOSEs)
9. CONTENTS
Liability rules
Reporting illegal content and remedial procedures
Transparency reports for online platforms
Ban on manipulative practices (dark patterns)
Restrictions on behavioral advertising
Exceptions for small and medium-sized companies
VLOPs/VLOSEs:
Risk assessment and mitigation
Audits (independent verification)
Recommender systems
Advertising archives
Data access for researchers and CSOs
Enforcement:
based on the country-of-origin principle
The EU Commission is responsible for VLOPs
Fines of up to 6% of global annual turnover
10. LIABILITY
Services are not liable for user-generated content as
long as they have no actual knowledge of it or, upon
obtaining such knowledge, "act expeditiously" to
remove the content (Articles 4-6)
General monitoring (Art. 8):
"No general obligation to monitor the information which
providers of intermediary services transmit or store, nor actively
to seek facts or circumstances indicating illegal activity shall be
imposed on those providers"
11. ORDERS
Orders to take action against illegal content and
information orders from authorities (Articles 9-10):
Harmonised and clear rules.
In future, orders from the judicial and
administrative authorities of the Member States
against illegal content and requests for information
from all online services must be acted on
expeditiously.
12. POINTS OF CONTACT & LEGAL
REPRESENTATIVES
Services designate a single point of contact so
that they can communicate electronically directly
with users, authorities and the Commission
(Articles 11-12)
Services that do not have an establishment in the
EU appoint a legal or natural person to act as
their legal representative (Article 13).
13. TERMS & CONDITIONS and TRANSPARENCY
T&Cs and community standards that establish rules for
content moderation must be applied in an objective and
non-arbitrary way in the future (Article 14).
Services are obliged to respect the fundamental rights of
their users when applying their terms and conditions
(Article 14).
Annual accountability for content moderation must be
provided in machine-readable transparency reports
(Article 15).
Very large platforms must also disclose how many staff
are employed for content moderation and how those staff
are trained and supported (Article 42).
14. NOTICES AND COMPLAINTS
Europe-wide harmonised notification procedures
to quickly and easily report potentially illegal
content on the Internet (Article 16).
Users finally get more rights via:
Justification and complaints procedures at the
platforms (Articles 17 and 20)
External dispute resolution procedures (Art. 21) so that
users can defend themselves against unlawful deletions
and account blocks.
15. ONLINE MARKETPLACES
Illegal activities on online marketplaces are
frequently uncovered, especially when it comes to
selling dangerous products. Too often the identity of
the providers is not verified.
DSA:
Dedicated chapter for marketplaces (Section 4)
Traceability of traders on platforms (duty to provide
information: name, contact details, registration numbers,
etc., Art. 29-32)
Online marketplaces must use their "best efforts" to check
whether the information has been provided reliably and
completely
16. RECOMMENDER SYSTEMS
Since the revelations by Facebook whistleblower Frances
Haugen, we have evidence of the negative impact of
online platforms' recommender systems: algorithms
amplify the distribution of polarising content.
DSA:
Online platforms must explain how their recommendation
systems work, why certain content is displayed and how users
can change this (Article 27).
Very large platforms must also offer an option to view the
content that is not personalized and not based on profiling
(Article 38).
17. MANIPULATIVE PRACTICES
"Dark patterns" push users every day, in an unfair and manipulative
manner, to make decisions such as:
Cookie banners that hide the option to refuse cookies.
Repeated harassment after users have already opted out.
Hiding "unsubscribe" options or how to close an account.
Article 25, DSA:
Websites and buttons must be designed fairly so that users have a real
choice in the future.
But: Practices already covered by existing consumer protection and
data protection legislation are not included in this prohibition.
But: no binding list of practices in the article, as provided for in the
parliamentary text. The EU Commission can publish guidelines to
explain specific prohibited practices.
18. SURVEILLANCE ADVERTISING
Prohibition of advertising based on profiling
which uses sensitive data - e.g. political opinions,
sexual orientation, trade union membership,
religion… (Article 26(3))
Prohibition of advertising based on profiling
using personal data of minors (Article 28(2))
19. RULES FOR VLOPs (45 MILLION USERS)
Risk assessment and mitigation (Art. 34-35)
Recommendation systems with alternative option not
based on profiling (Art. 27/38)
Crisis management (Art. 36)
Audits (independent review, Art. 37)
Ad archives for VLOPs (Article 39)
Access to platform data for researchers and civil
society (Art.40)
VLOPs supervisory fee (Article 43)
20. RISK ASSESSMENT & MITIGATION
VLOPs are required to conduct annual risk assessments
(Article 34): how do design and algorithmic systems
impact:
fundamental rights, human dignity, data protection,
diversity of opinion and media, non-discrimination,
youth protection and consumer protection?
public discourse and elections, gender-specific
violence, or the mental and physical health of users?
VLOPs are required to address the identified risks
(Article 35).
21. CRISIS MANAGEMENT
The EU Commission can, on the recommendation of
the DSA Board (consisting of the 27 Member States),
require actions from VLOPs, e.g. carrying out specific
risk analyses or taking measures.
This crisis response must be limited in time (max. 3
months).
The Commission's decision must be made public.
22. ACCESS TO PLATFORM DATA
Access to platform data for researchers and civil
society has been secured (Article 40).
VLOPs and VLOSEs have to explain the design, logic
and functioning of their algorithmic systems,
including their recommendation systems, at the
request of the authorities in the MS or the
Commission.
23. ENFORCEMENT
National level - Digital Services Coordinators (DSCs):
• Independent authorities
• Direct oversight and enforcement
• Coordination with other national authorities
EU Board for Digital Services (Art. 61):
• Ad hoc independent advisory board
• Composed of the DSCs
• Chaired by the EU Commission
• Advises the DSCs and the Commission, makes proposals
EU Commission:
• Direct oversight and enforcement for VLOPs
• Advises on cross-border cases
• Acts upon DSC request
24. ENFORCEMENT
We learned from the GDPR: Avoiding an “enforcement
bottleneck”
Regulators have strong enforcement measures at their disposal
(Articles 49 ff.)
Penalties of up to 6% of global annual turnover (Article 52)
Periodic penalty payments of up to 5% of average daily global
turnover may be imposed for persistent violations (Art. 52)
(Temporary) measures to stop the violation can be ordered or, in
special situations, the service can even be completely blocked
(Article 70).
EU Commission has centralized enforcement power for VLOPs.
VLOPs share the financial burden of their own supervision through
fees.
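The two penalty ceilings use different bases: the one-off fine is capped relative to annual turnover, the periodic penalty relative to average daily turnover. A minimal sketch of that arithmetic, assuming a simple 365-day average and a hypothetical company figure (both are illustrative assumptions, not from the text):

```python
# Illustrative only: the DSA caps a one-off fine at 6% of global annual
# turnover and a periodic penalty payment at 5% of average daily global
# turnover (Article 52). The 365-day averaging and the example figure
# below are assumptions for illustration.

def max_fine(annual_turnover: float) -> float:
    """Upper bound for a one-off fine: 6% of global annual turnover."""
    return 0.06 * annual_turnover

def max_daily_penalty(annual_turnover: float) -> float:
    """Upper bound for a periodic penalty payment per day:
    5% of average daily global turnover (annual turnover / 365)."""
    return 0.05 * (annual_turnover / 365)

# Hypothetical company with EUR 100 billion annual turnover:
turnover = 100e9
print(f"max one-off fine: EUR {max_fine(turnover):,.0f}")
print(f"max periodic penalty per day: EUR {max_daily_penalty(turnover):,.0f}")
```

For such a company the one-off cap is in the billions, while the daily penalty cap is three orders of magnitude smaller but accumulates for every day of continued non-compliance.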
25. ENFORCEMENT IN GERMANY
The draft law implementing the DSA in Germany,
published at the beginning of August 2023:
designates the BNetzA as the national DSC,
plus tasks for the Federal Agency for the Protection of
Children and Young People in the Media
and the Data Protection Authority (5 new FTEs).
NetzDG will be fully replaced by the DSA.
In Ireland, the DSC is the Coimisiún na Meán (a new
Media Commission)
In France, ARCOM will be the "coordinateur pour les
services numériques" (CSN)
26. NEXT UP
Entered into force in November 2022
Applies to VLOPs since 25 August 2023; applies to
everyone from 17 February 2024
In February 2023: user numbers published, COM then named 19
services as VLOPs/VLOSEs.
Risk reports were due 28/08-4/9 (4 months after
designation), but not published
Delegated Act on Audits in Q3 (September)
Template for transparency reports (by December)
Delegated Act on Art. 40, data access (planned 1st
quarter 2024)
Whistleblowing platform (similar to FISMA tool)
27. tl;dr
Platforms are not liable for the opinions of their users, but for
their own actions.
Systemic risks from the business model of surveillance advertising
and interaction-based recommendations are identified and
addressed.
BUT
Profiling for recommendation systems not restricted at all.
Prohibiting advertising profiling and manipulative practices does
not go far enough.
Systemic risks are researched but not eliminated immediately.
The DSA gives us tools, but it won't change the Internet from one
day to the next.
K.F.
- Defending and promoting digital rights for over 15 years, started with their own blog, articles at Netzpolitik.org
8 years at
Since EU elections 2019 Greens/EFA
Mainly on digital issues, IMCO
Alexandra shadow rapporteur on the DSA
Work of Greens/EFA since 2019
Consultation and ideal version of the law before it came out - "My Content, My Rights"
Interactive website on issues
THE DSA WAS ADOPTED IN JULY 2022 – APPLICABLE FOR VLOPS SINCE 25/8
AND GENERALLY APPLICABLE for all other sizes of services on 17/2/24, so in your country, unless you're from Ireland
Why do we need the DSA:
Landscape of digital services has changed dramatically in the past 10 years
GAFAM
The DSA is often dubbed the EU's law against big tech, the EU's platform regulation, or the new EU legislation against hate online.
IT IS THE FIRST TIME in digital policy making that the systemic risks of very big platforms are in the focus, and that there is acknowledgement that the business model is the problem – instead of just trying to tackle the symptoms or individual pieces of illegal content.
The DSA tries to tackle the business model of advertising companies Google and Meta – also jointly with the DMA – not sufficient but first step.
3. The ECD was outdated, time to adapt to new landscape.
Copy paste of liability regime
But in addition many other rules, especially for big ones
Liability rules from 2000
National judicial and administrative authorities can issue orders to notify services of the presence of illegal content
They have to meet a series of criteria. In particular, they must contain the reason why the content is illegal and information on the territorial scope of the order
Administrative Authorities: Potential for arbitrariness and abuse. In contrast to judicial orders, the police, for example, have no legal authority to interpret which content is illegal and which is not. In some cases the illegality can be obvious, but sometimes it can also limit freedom of expression (see copyright authority Hadopi in France). We fought for orders to come only from independent judicial bodies, but without a majority.
The Greens have had several successes with the terms and conditions and the transparency obligations, because:
All services must be transparent about what restrictions they have on the content on their networks, including how algorithms are used.
Great success: Services are obliged to respect the basic rights of their users - but let's see how this works in practice.
VLOPs: transparency reports on staff - for click workers in India, but also here in Europe, working conditions are precarious: moderators receive hardly any psychological support, although they have to view and delete tens of thousands of posts, pictures and videos full of violence and hatred every day.
And they are poorly paid. DSA not the place to change that but at least Big Tech will disclose information about this in the transparency reports.
In the DSA we were able to avoid a deadline such as a 24-hour deletion period (as in the NetzDG).
Such a deadline would have meant a de facto general monitoring obligation, and online services might have been forced to use filters in order to quickly remove content from the network and avoid liability. Especially for smaller services that would be a problem.
Achievements: No obligation for the reporting person to identify themselves.
The reporting procedure must be close to the content (see Facebook after the NetzDG - the NetzDG reporting form was hidden deep in the system behind several clicks).
The service must send a statement of reason to the user who uploaded the content. This also applies to all restrictions on the visibility of a post that violates the General Terms and Conditions, even if it's just a "downgrade", e.g. shadow banning.
Services must provide a complaints mechanism, but there is an exception for micro and small businesses.
Separate chapter for Online marketplaces, consumer protection orgs were pushing for this
Recommender systems such as "up next" on YouTube, recommended groups on Facebook, content of other users on Instagram, etc.
Article 25 on so-called Dark Patterns, which we really wanted and was very strong in the EP text:
Providers of online platforms must not design, organize or operate their online interfaces in such a way that users are deceived, manipulated or significantly impaired or hindered in their ability to make free and informed decisions.
HOWEVER: badly written, totally watered down in the last trilogue that went until 2am, enforcement becomes a big question mark. Guidance on this from the Commission, but not a priority.
The EU COM Digital Fairness fitness check of consumer protection rules is expected in 2024 - for the next mandate. Consumer advocates want stricter rules and a ban on dark patterns to be enforced on this basis.
Alexandra's main focus - putting an end to surveillance advertising. Online advertising is one of the financial foundations of the internet.
Ad tracking happens every second we're online.
Our behavior, likes, purchases, etc. are collected and evaluated. Then tailor-made advertising is served.
Large platforms create comprehensive data profiles about people that allow manipulation and control of entire population groups.
This is the current business model, which works on the principle of surveillance capitalism. End result not as strong as we wanted.
The EP text was significantly better than what came out at the end of the negotiations with the Council.
But: the success is that we have anything in it at all; that wasn't clear at the beginning of the DSA, because it was conceived as an instrument against illegal content.
The Commission published a good study a few months ago; we hope for further legislation in the coming mandate.
explain list.
Big Green success: our amendment for a supervisory fee for VLOPs did not make it into the EP's text, but the COM suddenly found this very good during the trilogue, and therefore it has become part of the final text. There is a similar mechanism at the European Central Bank, which collects a supervisory fee from all supervised banks.
The negative impact and systemic risks that very large online platforms pose not only to our rights and freedoms but also to our society and democracy more broadly are recognized in the DSA.
In exceptional situations (e.g. the Ukraine war) special measures can be taken quickly (Article 36). The Commission wanted an article on crisis management for exceptional circumstances at the last second, and only for VLOPs.
This was in response to the blocking measures against Sputnik and Russia Today shortly after the invasion of Ukraine.
We were very skeptical because of the potential for abuse: since COM cannot make an independent judicial decision for all MS - there is a risk of restrictions on freedom of expression.
Very large online platforms are playing an increasingly important role in society because they use their enormous power to influence opinions and public discourse.
Access to data is important to hold platforms accountable, to allow independent oversight and to understand how these platforms work. So in the future you can research how and whether Instagram drives young girls into anorexia.
One can explore how fast disinformation spreads and why. Independent oversight keeps VLOPs accountable. BUT exceptions: if platforms do not have the required data, and also if security or trade secrets are at risk.
Very good that this goes ahead now, because we are worried that the supervision will be delayed.
The DPA in Germany gets 5 new employees to monitor the prohibition on targeted ads.
Germany also wants to create an advisory board for the DSC.
That we have strong DSCs is particularly important with regard to the EU elections.
Alibaba AliExpress, Amazon, Apple's App Store, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando.
VLOSEs: Bing and Google Search.
Zalando and Amazon objected to their designation, thereby putting the whole validity of the DSA in question.
COM told us: priority is now child protection (Art 28) and electoral integrity (risk assessments)
Recommendation systems: sensitive data and other personal data are still allowed, i.e. in line with the GDPR. But we would have liked to see more.
Nevertheless: the DSA rules for VLOPs will affect attention-based ranking, which fills the pockets of the corporations with disinformation and hate speech. Also: we wanted a complete ban on surveillance advertising.