For those of us interested in the future of the internet, all eyes are on Brussels, where legislation (the Digital Services Act and the Digital Markets Act) is being shaped that aims to rein in the over-reaching power of online platforms. Their power is not only economic; it also affects democracy, because platforms influence access to information and the formation of opinion. That is why legislation tackling unfair and arbitrary practices by platforms will have a beneficial impact on our democratic future.
One damaging and concerning practice that needs to be curbed is interference by online platforms with media content. This is what happens: media content (news, entertainment, education and more) is subject to well-established and strict laws and editorial standards. But when media organizations, including our Members, put their content on platforms (video-sharing platforms, social networks, app stores, etc.), it is regularly interfered with by those platforms.
This interference takes many forms, and it is very often arbitrary. It ranges from blocking entire apps or accounts (for example, the Instagram account of Swedish Radio was removed without warning or explanation) to removing individual pieces of content and limiting access to specific content (as when a France Télévisions youth programme with information on contraception was prohibited for under-18s by Snapchat). Other examples include platforms restricting the visibility of media organizations' accounts or insisting on seeing editorial scripts before content can be uploaded.
These outcomes are the result of platforms arbitrarily applying their own unilaterally set terms and conditions to media content that already adheres to rigorous standards. Today it is solely up to the platforms, as private, commercial companies, to set, apply and enforce terms and conditions for media services and applications. As European Commission Vice-President Věra Jourová said at a conference this week, "there are no rules on what these conditions should be and no effective democratic oversight."
The Digital Services Act (DSA), which will set basic standards for platforms, can, and must, address this. By including rules in the DSA, this damaging platform behaviour can be ended. Such rules would set clear limits on how platforms' terms and conditions can be applied to content and services provided by professional media organizations.
This issue should not be conflated with the fight against disinformation. Public service media invest heavily in combating disinformation. Providing audiences with reliable, balanced and accurate news and information is at the heart of what we do. Our newsrooms develop, and partner on, many fact-checking activities to debunk fake news. Producing large volumes of high-quality, trusted news that is easy to access and find across all platforms is the best way to combat disinformation.
Global platforms also need to step up their own efforts on disinformation, including raising the bar on their 2018 Code of Practice on Disinformation, which is generally considered to have failed to deliver significant outcomes.
Nor should the issue be seen as some kind of immunity or exemption for media. Broadcast media are already heavily regulated, at both European and national level. They have complaints mechanisms in place and are overseen by regulators. The safeguards we advocate would leave these rules and standards intact; indeed, they would reinforce their application to the distribution of content online. If online platforms can do as they please with professionally produced media content, existing national and EU standards would be undermined and their effectiveness seriously hampered.
I am aware of the challenge for policymakers, and I thank them for their work in skillfully piloting the DSA through its course. The Act can bring a future with more online innovation and choice, but its success will depend equally on the safeguards it provides against unfair and arbitrary platform behaviour. Let's seize this decisive moment.