January 31, 2021 | The Intersection | information age, economics

What we must regulate when we regulate social media platforms

Public policy should seek to prevent the concentration of narrative rather than market power in social media companies


This is from The Intersection column that appears every other Monday in Mint.

The global debate over how to govern Big Tech has intensified after Twitter, Facebook, Alphabet and Amazon de-platformed former US President Donald Trump and some of his supporters in the wake of the mob raid on the US Capitol on 6 January. Clearly, transnational technology platforms not only influence politics and markets through the actions of users they do not control, but also wield political power directly themselves. Human society has yet to fully adjust to these new power centres of the Information Age, and all states—from autocracies to liberal democracies—are in their own ways contending with the challenges of how to limit, constrain, regulate and harness them.

The challenge for liberal democracies is particularly acute because Big Tech platforms affect cognition. By controlling the content and flow of information, they—and their users—can shape what people think, believe and do. Many business models rely on tripping the instinctive fast brain, causing people to jump from issue to issue and reducing their ability to pause, reflect and apply their reasoning mind. The edifice of liberal democracy—and free markets—relies on the belief that humans are capable of reason and make rational decisions after weighing pros and cons. What we are increasingly discovering is that we are not quite the rational decision-makers we believed we were, and are far more likely to be swayed by what other people think. This is true for personal decisions as well as for political and economic ones. And this loophole in human cognition is being exploited by political parties, religious organizations, firms and, of course, by Big Tech. It also raises a philosophical question: if people do not make rational decisions, what good are liberal democracy and free markets?

Liberal democracies are responding to this challenge in two distinct ways, both relying on existing instruments to fix a problem those instruments were not designed for. One approach is to regulate various aspects of the information domain, primarily through surveillance, restrictions on free speech and protection of rights. Yet, empowering governments to increase surveillance and restrict free speech inevitably leads to politicization, partisanship and injustice. Moreover, if you disagree with Facebook’s rules, you can choose not to use it. But if you disagree with your government’s rules, you can’t opt out. The other approach is to invoke competition law and try to use anti-trust regulation to curb the market power of Big Tech platforms. Apart from the fact that barriers to entry on the internet are almost non-existent, targeting the market power of technology platforms fundamentally destroys value for everyone, including consumers. Because the value of a network grows roughly with the square of its number of users, one big network is many times more valuable than the sum of four small networks, each a quarter its size. Breaking up Facebook Inc., for instance, might reduce the political power of the company and its owners, and perhaps temporarily slow down the spread of hate and bigotry. It is unclear whether these benefits can justify the destruction of global public value that breaking up such a company entails.

There is possibly a better approach. The political power of technology platforms comes from their narrative power more than their market power; from mindshare more than market share. It is this power that liberal democracies must check and will be justified in targeting. Facebook, Twitter and YouTube are considered ‘platforms’ because they do not exercise editorial control over what is published. Yet they do determine what users see, through algorithms that they exclusively control. This is the source of their narrative power, and this is what public policy must focus on.

One way to do this, as my colleague Mihir Mahajan proposes, is to require technology platforms to enable algorithmic competition. For instance, users on social media platforms should be able to choose how they want to order their feeds. Facebook gives you no choice today. Twitter lets you choose between an unfiltered reverse-chronological feed and one generated by its proprietary algorithm. Mandating algorithmic competition would require platforms to open up to third-party algorithm providers, who would, in turn, be subject to transparency and disclosure requirements. All platforms that cross a certain threshold of active users could be mandated by law to open up to such competition.
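To make the idea concrete, here is a minimal, purely illustrative sketch in TypeScript of what a pluggable feed-ranking interface might look like. The names and shapes used here, such as RankingProvider and FeedItem, are hypothetical and are not drawn from any platform's actual API.

// Hypothetical sketch: the platform supplies its raw, unranked feed and lets the
// user plug in a third-party ranking provider of their choice.

interface FeedItem {
  id: string;
  authorId: string;
  postedAt: Date;
  text: string;
}

// A third-party provider receives the unranked items and returns them in the
// order the user will see them. Providers would be registered with the platform
// and subject to transparency and disclosure requirements.
interface RankingProvider {
  name: string;
  disclosure: string; // plain-language description of how the ranking works
  rank(items: FeedItem[]): FeedItem[];
}

// Example provider: a simple reverse-chronological ordering, the kind of
// baseline a user might pick instead of the platform's own algorithm.
const reverseChronological: RankingProvider = {
  name: "reverse-chronological",
  disclosure: "Shows newest posts first; no personalization or engagement signals.",
  rank: (items) =>
    [...items].sort((a, b) => b.postedAt.getTime() - a.postedAt.getTime()),
};

// The platform applies whichever provider the user has selected.
function buildFeed(rawItems: FeedItem[], provider: RankingProvider): FeedItem[] {
  return provider.rank(rawItems);
}

The point of the sketch is the separation of roles: the platform supplies the raw items, while the user's chosen provider, whose disclosure is visible to users and regulators alike, decides the order in which they appear.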


In a similar vein, Francis Fukuyama, Barak Richman and Ashish Goel call for a middleware solution: “a competitive layer of new companies with transparent algorithms would step in and take over the editorial gateway functions currently filled by dominant technology platforms whose algorithms are opaque.” Another colleague of mine, Saurabh Chandra, believes that a simpler approach would be to make it mandatory for platforms to open their application programming interfaces (APIs) to third-party clients, which could then offer competing filtration algorithms. The approaches differ in complexity and detail, but what is common to them is a clear understanding that the problem is political, not economic, and that the way to tackle it is to use competition to limit narrative power, not market power. They offer us grounds to expect that we can use liberal democratic methods to safeguard liberal democracy in the Information Age.
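For the open-API variant, a rough sketch under the same caveat, assuming an entirely hypothetical endpoint and response shape rather than any platform's real API, might look like this:

// Hypothetical sketch of the open-API approach: a third-party client pulls the
// user's raw feed from a platform endpoint (the URL and fields are invented for
// illustration) and applies its own, transparent filtering rule.

interface RawPost {
  id: string;
  authorId: string;
  postedAt: string; // ISO 8601 timestamp
  text: string;
}

async function fetchRawFeed(userToken: string): Promise<RawPost[]> {
  // Placeholder endpoint; a real mandate would specify a standard API surface.
  const response = await fetch("https://platform.example/api/v1/feed/raw", {
    headers: { Authorization: `Bearer ${userToken}` },
  });
  return (await response.json()) as RawPost[];
}

// Example third-party filter: keep only posts from followed accounts published
// in the last 24 hours, newest first.
function filterRecent(posts: RawPost[], followedIds: Set<string>): RawPost[] {
  const cutoff = Date.now() - 24 * 60 * 60 * 1000;
  return posts
    .filter((p) => followedIds.has(p.authorId) && Date.parse(p.postedAt) >= cutoff)
    .sort((a, b) => Date.parse(b.postedAt) - Date.parse(a.postedAt));
}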



If you would like to share or comment on this, please discuss it on my GitHub.
