Accountability Not Liability: A Recipe for Sound Platform Regulation

Sachin Dhawan, Deputy Director at The Dialogue, argues that India should preserve safe harbour protections for online platforms.

The past few weeks have been tough for online platforms. The Parliamentary Standing Committee on Home Affairs released a report contending that they fail to promptly remove illegal content, thereby jeopardizing user safety. It suggested eroding the time-tested conditional liability exemptions granted to platforms for illegal content in order to bring them to heel. The recommendation is all the more significant because it is the latest in a long line of legislative and judicial pronouncements that take a dim view of liability exemptions for platforms.

However, India should retain rather than erode conditional liability exemptions, i.e. it should subject platforms to liability only if they fail to act on government or court notices to remove illegal user content. User accountability, not platform liability, is the change we need. This can be achieved by imposing appropriately calibrated procedural obligations on platforms. Such an approach will promote user safety without undermining user liberty – a delicate balance that has rarely been achieved in the history of platform regulation.

Safe Harbour Protection: Pillar of Democratic Discourse

The disenchantment with conditional liability exemptions for platforms (commonly referred to as safe harbour protection) is understandable. The volume of online harm has increased exponentially in recent times. And far too often, there seems to be no remedy for such harm. Content takedowns directed by government agencies and courts appear to be few and far between, offering little solace. They are akin to “using a teaspoon to remove water from a sinking ship” to borrow a phrase from the scholar Evelyn Douek.

But safe harbour protection should not be done away with. Removing it will put pressure on platforms to rapidly take down illegal user content. While this may seem like a desirable outcome, it will in fact be counterproductive: it will cause platforms to harm their users’ liberties in several ways.

First, it will compel them to engage in extensive surveillance, which will undermine user privacy. Once users know that they are being watched, they will be less likely to engage in free-flowing debate and discourse. Indeed, numerous studies confirm that such constricting changes in human behaviour do take place in response to surveillance.

Second, it will give platforms even more power over their users. It will do so by compelling them to make judgement calls about the legality of every piece of content that is uploaded. At the moment, platforms don’t have to exercise such direct granular control over the content that they host. But in a world without safe harbour protection, they will be forced to assume such control.

Third, it will result in iconoclastic content being disproportionately removed. Online platforms buoyed by safe harbour protection have ushered in an era of “decentralized speech” in the words of scholar Martin Husovec. Marginalized and vulnerable groups previously shut out of mainstream discourse have thus been empowered to speak far more freely and widely online.

But without safe harbour protection, platforms will not want to be subject to liability for the controversial ideas that these groups may espouse – ideas that may well upset people invested in maintaining the status quo. As a result, many users will find their content booted off platforms for no fault of their own.

The Way Forward: Accountability Not Liability

Thus, platforms shouldn’t be stripped of safe harbour protection. User liberties, especially those of vulnerable users, should be preserved. But this doesn’t mean that nothing can be done about online harms. Indeed, there is a solution, a way out, that builds upon liberty-preserving safe harbour protection laws rather than undermining them.

That solution involves making platforms accountable for having sound systems in place to govern content, rather than putting them on the hook for every individual piece of illegal content. It is a systemic approach to platform regulation that recognises the downsides of requiring platforms to intercept every piece of illegal content that their legions of users upload daily.

And the best part is that such a law already exists. That law is the Digital Services Act, which recently came into force in the European Union. India would do well to learn from this development. Thanks to the DSA, users in the EU have access to a wider range of tools to redress and even prevent the scourge of online harms than users in India. For example, they can access risk assessment reports that a sizeable number of platforms have to publish for public consumption. These reports, subject to review by independent auditors, are geared towards improving the design of platforms so that harm becomes less likely to occur.

Thus, the disadvantages of doing away with safe harbour protection should give proponents of such a measure pause. India should instead cement safe harbour protection as a central tenet of its platform regulation framework. And supplement it with a suite of systemic obligations on platforms, similar to those contained in the DSA. These will serve as a much more effective bulwark against online harms than the imposition of liability on platforms for their users’ illegal content.

Sachin Dhawan is Deputy Director, The Dialogue
