A framework for guaranteeing fairness in digital markets and tackling abusive behavior online is brewing in Europe, fed by an assortment of concepts and concerns, from online safety and the spread of disinformation to platform accountability, data portability and the fair functioning of digital markets.

European Commission lawmakers are even turning their eye to labor rights, spurred by regional concern over unfair conditions for platform workers.

On the content side, the core concern is how to balance individual freedom of expression online against threats to public discourse, security and democracy from illegal or junk content that can be deployed cheaply, anonymously and at huge scale to pollute genuine public debate.

The old adage that the cure for bad speech is more speech can falter in the face of such scale. And while harmful or illegal content can be a money spinner, outrage-driven engagement is an economic incentive that typically gets neglected, or edited out, of this policy debate.

Certainly the platform giants, whose business models depend on background data-mining of internet users to power their behavioral, content-sorting ad-targeting (activity that, notably, remains under regulatory scrutiny in relation to EU data protection law), prefer to frame what's at stake as a matter of free speech rather than bad business models.

With EU lawmakers opening a wide-ranging consultation on the future of digital policy, there's an opportunity for broader perspectives on platform power to shape the next decade online, and much more besides.

In search of next-generation rules

For the past 20 years, the EU's legal framework for regulating digital services has been the e-Commerce Directive, a cornerstone law that bakes in and harmonizes core principles, including liability exemptions, greasing the wheels of cross-border e-commerce.

In recent years, the Commission has supplemented this by pressing big platforms to self-regulate certain types of content, through a voluntary code of conduct on illegal hate speech takedowns, and another on disinformation. The codes lack legal bite, and lawmakers continue to chastise platforms for not doing enough, nor being transparent enough about what they are doing.
