An attempt at a fundamental-rights-based proposal

Dominant internet platforms like Facebook, Amazon and Google are increasingly becoming the arena of social and legal conflicts. We are witnessing a worldwide debate about potential new rules for dominant social media platforms (a so-called new “platform regulation”). These debates are highly complex in a law-based society because they require us to resolve conflicts between fundamental rights and risk delegating essential tasks to private actors. Still, the negative effects of harmful behaviour by these actors increase the political appetite for regulation.

To navigate the upcoming debate, we want to propose, collect, and evaluate concrete policy solutions within the fundamental rights framework of the European Union. These proposals have been reviewed by selected experts from academia and civil society. The project aims at broad acceptance of the developed positions among European civil society stakeholders. Given the complexity and novelty of the underlying problems, this proposal cannot be considered the solution to all questions in this field; instead, it aims to further the debate with a concrete proposal that also addresses enforcement processes. Importantly, this proposal does not tie enforcement to liability, as such an approach would inherently create an incentive for over-blocking on the part of platforms.

This is a policy proposal in the form and in the spirit of a request for comments. We invite everybody to participate in the discussion, to provide feedback, and to propose amendments to any of the proposals outlined below on this website: feedback@platformregulation.eu

0 Definitions and Basic Concepts

0.1 MUST Types of Recommendations

MUST This word means that the proposal is an absolute requirement of the recommendation.

MUST NOT This phrase means that the proposal is an absolute prohibition of the recommendation.

RECOMMENDED This word means that there may exist valid reasons in particular circumstances to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course.

NOT RECOMMENDED This phrase means that there may exist valid reasons in particular circumstances when the particular behaviour is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behaviour described with this label.

DISCUSS Policy proposal that is worth discussion within the community and requires further evaluation.

RFC2119 | Key words for use in RFCs to Indicate Requirement Levels

0.2 MUST Scope Limitations

These policy recommendations and discussions are limited in their scope to democratic countries with a stable rule of law and strong fundamental rights protections.

0.3 MUST Online Platforms

By online platforms, we understand services that provide an intermediary function in the access to information, goods or services residing on systems or networks at the direction of users.

Definition from Conseil national du numérique, Ambition numérique: Pour une politique française et européenne de la transition numérique, and the Digital Millennium Copyright Act.

0.4 MUST Social Media Platforms

By social media platforms, we understand those online platforms that operate with user-generated content and curate the content of their users through algorithmic or editorial decisions.

See also: EU | Digital Single Market | Online Platforms, Even Under Kind Masters: A Proposal to Require that Dominant Platforms Accord Their Users Due Process p.8 and Online platforms and how to regulate them: an EU overview p.4

0.5 MUST Relevant Platforms

By relevant platforms, we understand online or social media platforms that have significant market power in a country within the EEA and a global revenue above a certain minimum threshold.

Examples of definitions of significant market power, and how a regulator should assess it, can be found in telecommunications law, e.g. Article 35 of the Austrian Telecommunications Act 2003.

0.6 MUST Dominant Platforms

By dominant platforms, we understand online or social media platforms that have significant market power in a majority of countries in the EEA and a global revenue above a certain minimum threshold.

Examples of definitions of significant market power, and how a regulator should assess it, can be found in telecommunications law, e.g. Article 35 of the Austrian Telecommunications Act 2003.

0.7 MUST Sponsored Content

By sponsored content, we understand all content for which a payment has been made in order to promote it, either in general or to a specific audience.

0.8 MUST Dark Content

By dark content, we understand all sponsored content that is not visibly published through the account of the advertiser and displayed only to a specified target audience.

What is the Difference Between a Sponsored Post and a Paid Ad?

0.9 MUST API Accessibility

By API accessible, we understand a computer information system that gives access to content via a unique identifier. This requires that data be downloadable in bulk, by day, week or year and per country. New data shall be accessible via the system within a day of being published. APIs should be designed in a way that sustains independent research and long-term studies.
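As an illustration only, the following sketch shows what these access patterns could look like in practice: one address per unique content identifier and a bulk download bounded by country and time granularity. All endpoint names, parameters and the base URL are hypothetical assumptions, not an existing platform API.

```python
# Minimal sketch of the access patterns described above.
# All endpoints and parameters are hypothetical, not an existing platform API.
from dataclasses import dataclass
from datetime import date

BASE_URL = "https://platform.example/api/v1"  # hypothetical base URL


@dataclass
class BulkQuery:
    """Bulk download request: per country, bounded by day/week/year."""
    country: str        # ISO 3166-1 alpha-2 code, e.g. "AT"
    start: date
    end: date
    granularity: str    # "day", "week" or "year"

    def to_url(self) -> str:
        return (f"{BASE_URL}/content/bulk"
                f"?country={self.country}"
                f"&from={self.start.isoformat()}&to={self.end.isoformat()}"
                f"&granularity={self.granularity}")


def item_url(content_id: str) -> str:
    """Every piece of content is addressable via a unique identifier."""
    return f"{BASE_URL}/content/{content_id}"


if __name__ == "__main__":
    q = BulkQuery(country="AT", start=date(2019, 1, 1),
                  end=date(2019, 1, 31), granularity="day")
    print(q.to_url())
    print(item_url("abc123"))
```

In such a scheme, a long-term study could repeat the same bulk query every day and rely on new content appearing in the results within a day of publication.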

Facebook and Google: This is What an Effective Ad Archive API Looks Like; Facebook’s Ad Archive API is Inadequate

0.10 MUST Content Provider

By content provider, we understand the person or entity that has published or created the post with the content in question.

0.11 MUST Political Accounts

By political accounts, we understand those accounts run by, or acting on behalf of, political parties, associations affiliated with political parties, or politically exposed persons as defined by Article 2 of EU Directive 2006/70/EC.

EU Definition PEP

1 Content Regulation

1.1 MUST Procedural Safeguards for Content Notifications

A central pitfall of the current notification and action regime is the lack of procedural safeguards for the notification procedure. Every online platform needs to present to the user easily accessible, user-friendly and contextual notification options. These options should be available without the obligation to sign in or sign up with the service itself, if the content in question is publicly available.

  1. Notifications should offer categories of different types of violations, ranging from various classes of illegal content to legal content that might be in breach of the Terms of Service or other rules of the platform. Different notification categories should trigger different procedures, which take into account the fundamental rights of all parties in question, meaning that procedures with stricter safeguards cannot be substituted by procedures with less strict ones. For example, a notification of illegality with the possibility of legal redress cannot be circumvented by deletion of the content in question under the Terms of Service of the platform.

  2. A valid notification should be sufficiently precise and adequately substantiated. This should include 1) the location of the content (URL); 2) the reason for the complaint (potentially including the legal basis under which the content has to be assessed); 3) evidence of the claim and, potentially, legal standing; 4) a declaration of good faith that the information provided is accurate; and 5) considerations on limitations, exceptions, and defences available to the content provider (a minimal data-structure sketch of such a notification follows after this list). Only in notifications of violations of personality rights or intellectual property rights is identification information of the notifier mandatory. In all other cases, identification and contact information of the notifier are optional.

  3. For purposes of procedural fairness and to increase the quality of content moderation, the content provider should be informed about a notification concerning his or her content, the reason for the notification, the subsequent process and possible ways to appeal or file a counter-notification. The content provider should be informed immediately once the platform has received the notification and not just after a decision has been taken. Exceptions from this obligation to notify the content provider may apply only if sending the notification would hamper ongoing law enforcement investigations.

  4. The possibility of counter-notification should be offered to the content provider to respond to the claim of the original notifier with evidence and arguments to the contrary. This counter-notification should be an option even before a decision by the platform is taken. Both the original notification and the counter-notification should be subject to the same standards in terms of declarations of good faith. The counter-notification can also be filed after the content has already been removed and can also challenge the category of the content in question.

  5. Online platforms have to inform the parties involved in a notification about the outcome of the decision the platform has taken in their case. This communication is always sent to content providers, and to notifiers if they have provided contact details in their notification. It needs to include 1) the platform’s reasoning for why it came to this decision; 2) the circumstances under which the decision was made, and whether it was made by a human or an automated decision agent; and 3) information about the possibility for either party to appeal this decision with the platform, courts or other entities. This communication should also be sent for counter-notifications.

  6. Online platforms need to publish information about their procedures and time frames for intervention by interested parties. This information should include 1) the time before a notification is sent to the content provider; 2) the time for the content provider to respond with a counter-notification; 3) the average and maximum time for a decision by the platform per category of case; 4) the time at which the platform will inform both parties about the result of the procedure; and 5) the time for different forms of appeal against the decision.
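The following sketch shows one possible way to represent a valid notification as a data structure, following points 2 and 4 above. The field names and category values are illustrative assumptions, not an existing schema or statutory list.

```python
# Sketch of the notification fields listed in point 2 above.
# Field names and category values are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Notification:
    content_url: str                       # 1) location of the content (URL)
    reason: str                            # 2) reason for the complaint
    legal_basis: Optional[str] = None      # 2) optional legal basis for assessment
    evidence: list[str] = field(default_factory=list)  # 3) evidence / legal standing
    good_faith_declaration: bool = False   # 4) declaration that the information is accurate
    limitations_considered: Optional[str] = None  # 5) exceptions/defences for the provider
    # Identification is mandatory only for personality-rights or IP notifications:
    notifier_name: Optional[str] = None
    notifier_contact: Optional[str] = None
    category: str = "terms_of_service"     # e.g. "illegal_content", "personality_rights"


def is_valid(n: Notification) -> bool:
    """A notification is sufficiently precise only if the core fields are present."""
    core_ok = bool(n.content_url and n.reason and n.good_faith_declaration)
    needs_identity = n.category in {"personality_rights", "intellectual_property"}
    return core_ok and (not needs_identity
                        or bool(n.notifier_name and n.notifier_contact))
```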

These proposals draw strongly on Kuczerawy, Aleksandra, Safeguards for Freedom of Expression in the Era of Online Gatekeeping (September 11, 2018), Auteurs & Media, 2018 (forthcoming), which builds on the Manila Principles.

1.7 DISCUSS Trusted Flaggers

Dominant and relevant social media platforms may appoint trusted flaggers within a country. Notifications from trusted flaggers are dealt with more expeditiously than others, but they are subject to the same safeguards as regular notifications. A list of all current and previous trusted flaggers has to be published by the platform. The application and revocation process, as well as the criteria for an organisation or an individual to be awarded trusted flagger status, must be made public. Governmental institutions should never be able to become trusted flaggers.

1.8 MUST Establish Registered Offices to Interact with Authorities

Dominant platforms have to establish registered offices in the EU countries where they conduct their business. These offices allow local law enforcement and courts to reach the platform under their jurisdiction.

Wikipedia: Ladungsfähige Adresse and registered office

1.9 MUST NOT Real Name Policy

For many marginalised groups, anonymity is a pre-condition for the exercise of the right to freedom of speech. Obliging all account holders on social media platforms to register with their real identity in order to foster effective law enforcement would therefore lead to a chilling effect.

Arguments against a real name policy proposal in Austria; Alternatives to real name policies

2 Algorithmic Accountability and Disinformation

2.1 MUST Empower Users to Take Control Over Algorithmic Curation of Information

Users must have an easily accessible option to sort the content being displayed to them. Dominant and relevant social media platforms have to offer this possibility to users. At the least, the setting should include a fully chronological timeline, but it would benefit from also including other factors that empower users to take control of their information diet. Users can take these decisions actively for the duration of individual sessions. The concrete options the platform must offer can be evaluated by the regulator, which can issue guidance on potential additions and the design of the feature. This obligation does not exclude the potential insertion of sponsored content.

In allowing users to see how much content is otherwise hidden from them, this measure improves the user’s understanding of algorithms. It also enables them to understand the amount of content posted by accounts they follow. Technically this option should not create an undue burden for the platform provider.
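As a minimal sketch of what such a user-selectable setting amounts to technically, the snippet below orders the same set of posts either fully chronologically or by an assumed engagement-based ranking score; the Post fields and the scoring are illustrative assumptions, not any platform’s actual ranking model.

```python
# Sketch of a user-selectable sorting option: chronological vs. ranked feed.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    published: datetime
    engagement_score: float  # whatever the platform's ranking model would produce


def sort_timeline(posts: list[Post], mode: str = "chronological") -> list[Post]:
    """Return the feed in the order the user has chosen for this session."""
    if mode == "chronological":
        return sorted(posts, key=lambda p: p.published, reverse=True)
    if mode == "ranked":
        return sorted(posts, key=lambda p: p.engagement_score, reverse=True)
    raise ValueError(f"unknown sorting mode: {mode}")
```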

2.3 DISCUSS Scientific Access to Dominant Platforms via Committee Safeguard

An EU committee should be established which receives and decides on research applications from independent academic institutions that offer a benefit to society. Approval is dependent on the research proposal meeting ethical, data protection and scientific standards. Once approved, the dominant social media platform has to grant access to the defined data sets. An oversight board will enforce the compliance of researchers and the platform with the agreed data protection and research standards. Data provided by the dominant platform needs to be consistent and in a standardised machine-readable format.

Social Science One: a positive attempt that stalled because there were no sanctions when dominant platforms refused to cooperate.

2.4 DISCUSS Scientific Access to Dominant Platforms with Differential Privacy Safeguards

Dominant platforms should provide access to their data via a differential privacy interface to the researchers selected by a committee. To protect private user data, the differential privacy measure introduces statistical noise into the output of every query.
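A common way to build such an interface is the Laplace mechanism, sketched below for a simple counting query. The epsilon value and the query shape are illustrative assumptions, not a proposal for concrete parameters.

```python
# Minimal sketch of a differentially private count query (Laplace mechanism).
# The interface and parameter values are illustrative, not a platform's actual API.
import math
import random


def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def dp_count(true_count: int, epsilon: float = 0.1, sensitivity: float = 1.0) -> float:
    """Answer a counting query with noise calibrated to sensitivity / epsilon.

    Smaller epsilon means more noise and stronger privacy protection.
    """
    return true_count + laplace_noise(sensitivity / epsilon)


if __name__ == "__main__":
    print(dp_count(10_000, epsilon=0.5))
```

Because every answered query consumes part of a privacy budget, the committee would still have to limit how many queries each research project may run.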

Other systems that randomise user data could be bypassed by requesting multiple sets of data and reverse-engineering the randomisation process. The risk that large amounts of personal data are exposed would then be higher than the benefits possibly gained from publicly accessible research data.

A possible option could be to provide quarterly sets of randomised data for public research, which are randomised only once and then published as such.

Nahles, Daten für alle Gesetzesvorschlag

2.5 MUST Transparency Reports

Proportionate transparency obligations have to empower users to adequately assess the trustworthiness of platforms. Reporting obligations have to be fulfilled with a proportionate regularity and in an openly licensed, easily understandable and machine-readable format. Platforms are required to publish such reports proportionate to their size, market share, and the potential risk for users. Transparency reports need to be published on the following topics:

  1. Report on law enforcement requests for user data, containing, at least: the total number of requests for user data, the total number of accounts affected, the sensitivity of the data requested, the total number of requests that were fully complied with and, listed separately, the total number of requests with which the platform complied only partially or not at all. This data must be provided per country, per legal basis for the request and, if different security authorities are involved, also per authority (a minimal sketch of a machine-readable report entry follows after this list).
  2. Report on legal requests for content and account blocking, containing, at least: the total global number of notifications of illegal content, the total number of accounts affected by such requests, the total number of requests complied with and, listed separately, the total number of requests complied with only partially or not at all. The total number of requests for blocking accounts, the total number of accounts affected thereby, and the types of legal demands requiring content to be blocked or deleted should also be included. This data must be provided per country, per legal basis and, if different security authorities are involved, also per authority.
  3. Report on the enforcement of the Terms of Service, containing, at least, the total number of account blockings or suspensions and content deletions per category of violation. The reporting needs to include the average time elapsed between publication of the content, notification, potential counter-notification, and action. This reporting will be categorised by the different sections of the Terms of Service, the actions that were taken, and whether the decisions were partially or fully automated. Dominant platforms need to lay out how the enforcement of the Terms of Service is implemented and overseen. The report should also highlight all cases where the outcome of a content moderation decision based on the Terms of Service contradicted the outcome of a notification of illegality.
  4. Public authorities should make available, publicly and in a regular manner, comprehensive information on the number, nature and legal basis of content restriction requests sent to intermediaries and on the actions taken as a result of those requests. Further, this information should include content restrictions based on international mutual legal assistance (MLA) requests. States should publish detailed transparency reports on all content-related requests issued to intermediaries.
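As referenced in point 1 above, the following is a minimal sketch of what one machine-readable entry of such a report could look like. All keys and values are illustrative assumptions rather than an existing reporting schema.

```python
# Illustrative sketch of one machine-readable transparency-report entry
# (point 1 above: law enforcement requests for user data). Keys are assumptions.
import json

report_entry = {
    "reporting_period": "2019-H1",
    "country": "AT",
    "legal_basis": "example_national_provision",   # placeholder, not a real citation
    "authority": "criminal_police",
    "requests_received": 120,
    "requests_fully_complied": 95,
    "requests_partially_complied": 10,
    "requests_rejected": 15,
    "accounts_affected": 310,
    "data_sensitivity": "subscriber_data",          # e.g. subscriber, traffic or content data
}

print(json.dumps(report_entry, ensure_ascii=False, indent=2))
```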

The Santa Clara Principles

Current Twitter transparency on legal information request

Current Google transparency on legal information requests

Current Facebook transparency on legal information requests

Current Microsoft transparency on legal information requests

Current Apple transparency on legal information requests

Current Facebook transparency on legal content restrictions

Current Twitter transparency on legal content removal

Current Twitter transparency on Terms of Service enforcement

Current Facebook transparency on community standards enforcement

2.6 MUST Rectification of Behavioural Profiles

Users must be able to rectify and edit their personal advertisement profile. The user can have information changed that algorithms have derived from incorrect data, as well as information that an algorithm has incorrectly composed from correct information, without the necessity to prove the truthfulness of the request. The user interface of the platform needs to display the option for rectification close to every targeted advertisement that is based on profiling. The user interface also has to display the criteria via which the user was targeted with this particular advertisement.
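The sketch below illustrates, under assumed attribute names, what an editable profile entry with such a rectification option could look like; it is not based on any existing platform interface.

```python
# Sketch of an editable advertisement-profile attribute with a rectification option.
# Attribute names and the correction flow are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class ProfileAttribute:
    name: str               # e.g. "interested_in_travel"
    value: str
    inferred: bool          # True if produced by an algorithm rather than stated by the user
    user_corrected: bool = False


def rectify(attr: ProfileAttribute, new_value: str) -> ProfileAttribute:
    """Apply the user's correction without requiring proof of its truthfulness."""
    return ProfileAttribute(name=attr.name, value=new_value,
                            inferred=attr.inferred, user_corrected=True)

# The targeting criteria displayed next to an advertisement could then simply
# list the (possibly corrected) attributes that caused the ad to be selected.
```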

Aspects of this proposal are already covered by the Right to rectification | Art. 16 GDPR. However, algorithmic assumptions about a person that cannot be disputed with facts are not covered by the right to rectification. Current Facebook Ad Preference Screen

2.7 MUST Advertisement Archive

Dominant platforms must make an archive of sponsored content available if the content was either displayed within the European Union or paid for by an account registered within the European Union. This archive must contain all sponsored content displayed within the last several years, with full functionality, as it was displayed to the user. The additional information stored in this archive must also be provided in a machine-readable format and be accessible via an API. Additional information that needs to be supplied within the archive includes: whether the sponsored content is currently active or inactive; the start date for active content and the timespan in which the sponsored content was active for inactive content; the name and contact details of the advertiser; the total number of impressions; the exact description of the target group; the exact amount of money paid and, while active, the estimated amount. For sponsored content that has to be depublished due to Terms of Service violations or legal proceedings, the additional information needs to stay in the ad archive and further information about the type of rule violation or pending lawsuit needs to be provided. Each piece of sponsored content must contain an attached info button that links directly to the content within the Advertisement Archive.
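To make the required fields concrete, the following sketch shows one possible machine-readable record of such an archive as it might be returned via the API. The field names are illustrative assumptions, not an existing platform schema.

```python
# Illustrative sketch of one advertisement-archive record with the fields listed above.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ArchivedAd:
    ad_id: str
    active: bool
    start_date: str                                   # ISO date, for active content
    active_period: Optional[tuple[str, str]] = None   # (start, end) for inactive content
    advertiser_name: str = ""
    advertiser_contact: str = ""
    impressions: int = 0
    target_group: dict = field(default_factory=dict)  # exact targeting criteria
    amount_paid_eur: Optional[float] = None            # exact amount once inactive
    amount_estimated_eur: Optional[float] = None       # estimate while still active
    removed_reason: Optional[str] = None  # ToS violation / pending lawsuit, if depublished
```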

This provides more transparency about commercial advertisement in general and, by building awareness, may also help to counter manipulation of the public. In addition to algorithmic transparency, the possibility to understand why an advertisement is shown to you may also be an important step in understanding why people see what they see online (algorithmic content composition). For more information on the current Ad Libraries see the description under “Political Advertisement Archive”. About Facebook Ads

2.8 MUST Political Advertisement Archive

All political sponsored content needs to be centrally visible in a public advertisement archive. This archive must store all political sponsored content for several legislative terms and must contain all sponsored content displayed within the last several years, with full functionality, as it was displayed to the user. The additional information stored in this archive must also be provided in a machine-readable format and be accessible via an API. Additional information that needs to be supplied within the archive includes: whether the sponsored content is currently active or inactive; the start date for active content and the timespan in which the sponsored content was active for inactive content; the name and contact details of the advertiser; the total number of impressions; the exact description of the target group; the exact amount of money paid and, while active, the estimated amount. Following a follow-the-money approach, intermediaries have to list the ultimate client or beneficiary of the sponsored content. Political sponsored content must be distinguishable from common sponsored content; to achieve this, political accounts need to register with the platform and subsequently be distinguishable from common accounts. To increase the accountability of political actors, politically sponsored content needs to contain a link referencing this content in the political advertisement archive. For sponsored content that has to be depublished due to Terms of Service violations or legal proceedings, the additional information needs to stay in the ad archive and further information about the type of rule violation or pending lawsuit needs to be provided. Each piece of sponsored content must contain an attached info button that links directly to the content within the Advertisement Archive.

Political Accounts: Facebook on ads related to politics or issues of national importance; Facebook authorization process for political accounts; Facebook getting started for political accounts

Political ads are not restricted to political parties, leaders, or foundations. Therefore, it is important to create a general advertisement archive. See: Advertisement Archive.

It is important to list the ultimate beneficiary because political propaganda may also be spread by dummy accounts to bypass regulation concerning political advertisement.

Providing the exact amount of money spent on a political online advertisement is also a requirement to effectively monitor political campaign regulation. EU country comparison on political campaign regulation p. 12

Current Facebook political Ad Library

Current Twitter political Ad Library

Current Google general political Ad Library

Current Google EU political Ad Library

Bing banning political advertising

Instagram currently only provides the option to report political ads within the US and displays such ads within Facebook’s Ad Library

What is Facebook doing to secure elections

2.9 MUST Prohibition of Dark Content

Dark content provided by political accounts is generally prohibited. All content that is published by a public political account needs to be visible on the account page.

Personalised election promises foster misinformation and weaken democratic discourse. Every political message needs to be accountable and subject to public scrutiny.

3 Interoperability and Competition

3.1 MUST Interoperability Obligation

The platform regulator has within its mandate the power to order, on a case-by-case basis, the provision of data transfer and service interoperability measures. Such measures can only be ordered from dominant platforms.

  1. Criteria for the evaluation are technical feasibility, the increase in consumer choice, and competition and innovation to the benefit of smaller market participants. Orders of data transfer and interoperability measures shall not increase risks to user privacy and security.
  2. The regulator should follow a co-regulatory approach on the detailed standardisation of APIs and the semantic markup of the data in question. Regular consultations should be held with other EU institutions like ENISA and the EDPS, industry, and interested parties to identify potential needs for regulatory intervention.
  3. Data transfers must entail a semantic markup of the data to be transferred. API accessibility shall, where technically feasible, be built upon decentralised technologies (e.g. OAuth) instead of intermediary data portability platforms; a minimal sketch of such a direct transfer follows below.

See Data transfer project, Data transfer project whitepaper, Right to data portability | Art. 20 GDPR, Economics of open and closed systems - switching costs p. 8 and Tim Berners-Lee | Solid | true data ownership
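The following sketch illustrates, under hypothetical endpoints and tokens, how a direct, OAuth-authorised transfer of semantically marked-up data between two services could look without an intermediary data portability platform. The URLs, the token handling and the vocabulary are assumptions for illustration only.

```python
# Minimal sketch of a direct, OAuth-authorised data transfer between two services.
# Endpoints and the JSON-LD-style payload vocabulary are hypothetical assumptions.
import json
import urllib.request

EXPORT_URL = "https://source.example/api/export/posts"   # hypothetical source endpoint
IMPORT_URL = "https://target.example/api/import/posts"   # hypothetical target endpoint


def transfer(access_token_source: str, access_token_target: str) -> None:
    # 1) Pull the user's data from the source platform, authorised via an OAuth bearer token.
    req = urllib.request.Request(
        EXPORT_URL, headers={"Authorization": f"Bearer {access_token_source}"})
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)

    # 2) The payload carries semantic markup so the target can interpret it,
    #    e.g. a JSON-LD "@context" declaring what each field means.
    assert payload.get("@context"), "data transfer requires semantic markup"

    # 3) Push directly to the target platform -- no intermediary portability platform.
    body = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(
        IMPORT_URL, data=body, method="POST",
        headers={"Authorization": f"Bearer {access_token_target}",
                 "Content-Type": "application/ld+json"})
    urllib.request.urlopen(req)
```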

3.4 DISCUSS Effective Assessment of Market Power in Digital Markets

The criteria upon which market power is assessed should include proxies, such as control of the data necessary for the creation and provision of services. Abuses of market power often also entail other breaches, such as of consumer law or privacy protections. Close cooperation between the competent authorities is a key requirement for effective enforcement.