Platforms: From Revolution to Regulation

As platforms set up control points and competitive bottlenecks to establish dominance, they often act against the interests of other actors in their ecosystem. This creates several scenarios in which ecosystem participants, and society and markets at large, may be adversely affected by the platform’s dominance. Effective regulatory mechanisms must target these control points.
First, several B2C platforms leverage their control over market access or data to compete directly with other players in the ecosystem. Google has been extensively investigated and fined for unfair practices such as demoting rivals in its search results, while several researchers have demonstrated how Amazon uses its visibility into market-wide data on its platform to selectively compete with merchants.
Second, platforms like Facebook and Google have come under increasing scrutiny over their data usage practices. Facebook’s news feed and Google’s search service are important control points that enable these platforms to constantly harvest data from users, often more than is required to improve their services. This data, in turn, serves as a control point over brands and advertisers looking to influence these users.
Third, platforms may start as open systems but go on to exert increasing control over the ecosystem, without prior warning to ecosystem members and to their detriment. Google has been criticised for creating an increasing number of control points over Android, while platforms like Twitter and Uber have often changed policies to the detriment of actors in their ecosystems.
Some platforms also engage in unfair and differential treatment of their ecosystem partners. For instance, Amazon and Flipkart have been charged by the Competition Commission of India (CCI) with extending preferential treatment to certain sellers in exchange for bilateral arrangements that favor the platform. Both platforms have been accused of offering deep discounts and preferential listings to certain sellers, to the detriment of other sellers.
Another aspect of fairness concerns the inability to audit the algorithms platforms use. Since platform algorithms determine market access, and consequently market power, for ecosystem participants, the black-boxing of such algorithms prevents regulatory scrutiny.
Platforms may also commoditize ecosystem players by intensifying competition among them. For instance, Amazon’s Buy Box is a highly contested battleground that intensifies competition among merchants, increasingly commodifying them and eroding their margins. When multiple sellers offer the same product, Amazon’s algorithms determine which seller is featured in the Buy Box (a hypothetical sketch of such a selection rule appears below). This matters because an estimated 80-90% of Amazon’s sales flow through the Buy Box. While the Buy Box reduces search costs for consumers and helps create a more efficient market, smaller merchants are squeezed by this intensified competition.
Finally, as more services and work move into the platform economy, platforms also use their dominant power to exploit workers and service providers in their ecosystem. 
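Amazon does not disclose how the Buy Box winner is chosen, so the sketch below is a purely hypothetical selection rule: the weighting of price, delivery speed, and seller rating is an illustrative assumption, not Amazon’s actual algorithm. It only demonstrates the winner-takes-most dynamic that commodifies sellers.

```python
from dataclasses import dataclass

@dataclass
class SellerOffer:
    seller: str
    price: float          # total price including shipping
    delivery_days: int    # estimated delivery time
    rating: float         # seller rating on a 0-5 scale

def buy_box_score(offer: SellerOffer) -> float:
    """Hypothetical composite score: lower price and faster delivery
    raise the score, weighted by seller rating. The formula and
    weights are illustrative assumptions only."""
    return (offer.rating / 5.0) / (offer.price * (1 + 0.1 * offer.delivery_days))

offers = [
    SellerOffer("seller_a", price=19.99, delivery_days=2, rating=4.8),
    SellerOffer("seller_b", price=18.49, delivery_days=5, rating=4.2),
    SellerOffer("seller_c", price=21.00, delivery_days=1, rating=4.9),
]

# The single featured seller captures nearly all sales for the listing,
# which is what pushes rival merchants into aggressive price cuts.
winner = max(offers, key=buy_box_score)
print(f"Featured in the Buy Box: {winner.seller}")
```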
 
Moving Fast Breaks More Things 
 
Silicon Valley’s “Move Fast and Break Things” refrain has led to digital platforms making their own rules. The technology industry usually leaves the quality and damage control of minimum viable products to the end user. While this worked at a small scale previously (even for Google and Facebook), it led to unforeseen problems as these platforms scaled their economic influence. Seemingly innocent experiments, such as A/B testing on peer-to-peer lending platforms, create market fairness issues. False positives spewed out by facial recognition algorithms are proof of what happens when new technology is released without safeguards. Racial and gender bias has been prominent in facial recognition software and AI, including systems used by law enforcement. Just as medicines and chemicals must adhere to safety standards, tech platforms too should demonstrate compliance with fit-for-market standards before entering the market. Platforms should be held financially accountable for any damage done by their products.
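To make the bias concern concrete, a standard audit step is to compare false positive rates across demographic groups. The sketch below uses invented decision records; the groups and numbers are illustrative assumptions, not real audit data.

```python
from collections import defaultdict

# Hypothetical audit log: (group, predicted_match, actual_match)
decisions = [
    ("group_a", True, False), ("group_a", False, False),
    ("group_a", True, True),  ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False),
    ("group_b", True, True),  ("group_b", False, False),
]

def false_positive_rates(records):
    """False positive rate per group: the fraction of true non-matches
    that the system wrongly flagged as matches."""
    fp = defaultdict(int)
    negatives = defaultdict(int)
    for group, predicted, actual in records:
        if not actual:
            negatives[group] += 1
            if predicted:
                fp[group] += 1
    return {g: round(fp[g] / negatives[g], 3) for g in negatives}

# A large gap between groups is exactly the disparity that audits of
# deployed face recognition systems have reported.
print(false_positive_rates(decisions))
```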
 
No liability, no incentive?
 
Platforms bear little liability and have no incentive to reduce the distribution of harmful content as long as such distribution serves to scale their business models. Facebook, Instagram, YouTube, and Twitter make money from advertising, which relies on user attention. Platforms use algorithms to push controversial content such as hate speech, disinformation, and conspiracy theories because it draws attention and amplifies user engagement. Recommendation engines will continue to put out potentially harmful content as long as there is no economic incentive for them to stop. In the US, platforms enjoy protection from liability under Section 230 of the Communications Decency Act of 1996, which grants platforms immunity for harm caused by third-party content.
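The underlying mechanism is easy to illustrate. In the toy ranker below, both the engagement model and the assumption that controversy lifts predicted engagement are hypothetical; the point is that a purely engagement-maximizing objective surfaces controversial content without any judgment about harm.

```python
# Hypothetical feed items with invented engagement scores. In this toy
# model, "controversy" boosts predicted engagement, so an engagement-
# maximizing ranker amplifies controversial content by construction.
items = [
    {"id": "news_report",     "base_engagement": 0.30, "controversy": 0.1},
    {"id": "conspiracy_post", "base_engagement": 0.25, "controversy": 0.9},
    {"id": "friend_update",   "base_engagement": 0.40, "controversy": 0.0},
]

def predicted_engagement(item, controversy_lift=0.5):
    # Assumption: controversial content draws disproportionate attention.
    return item["base_engagement"] + controversy_lift * item["controversy"]

ranked = sorted(items, key=predicted_engagement, reverse=True)
print([item["id"] for item in ranked])
# -> the conspiracy post outranks the straight news report
```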
 
Regulation of data privacy
 
Prof. Shoshana Zuboff of Harvard argues that personal data should be treated like body organs: as a human right rather than as an asset that can be bought or sold. According to Zuboff, no corporation should be allowed to use private data to influence user choices.
Of course, blanket regulation of data would be rife with issues. Regulations like Europe’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) place the onus on consumers to opt out of data usage by companies. But this is problematic, as consumers are usually unaware of how their data are exploited by corporations.
Moreover, much of the value creation in the digital economy relies on the effective use of data to improve market interactions and consumer decisions. Over-regulation aimed at ensuring privacy runs the risk of imposing too many controls on data acquisition by platforms, which could in turn directly impair a platform’s ability to enable efficient markets.
Regulators will need robust frameworks that distinguish uses of data that benefit consumers and markets from those that are purely exploitative and concentrate value in a single actor, typically the platform owner. Ideally, regulation would address data ownership and usage rights in a manner that not only enables the platform to create an efficient market but also protects ecosystem actors and assists the bodies that represent them.
 
 
Whither regulation? – Various schools of thought
 
Should platforms be regulated? Various schools of thought have emerged over the past several years. 
 
1. Platform bans
 
Several jurisdictions have taken the extreme approach of entirely banning platforms that do not comply with existing regulations. This is unlikely to prove a nuanced or sustainable solution. Bans are often championed by lobbying incumbents seeking to protect a traditional advantage, and they run the risk of disincentivizing innovation entirely. Moreover, the imposition of bans is far from uniform, and this inconsistency tends to produce a fragmented regulatory landscape that can impede concerted and consistent regulatory action against a platform. More importantly, a fragmented regulatory landscape also has larger systemic effects, such as the migration of technology firms to jurisdictions with lighter regulation, with long-term repercussions for the cities and countries imposing the ban.
 
2. No regulation is good regulation
 
Another response, at the opposite end of the regulatory spectrum, is the complete absence of regulation. Some scholars argue that traditional regulation, when applied to platforms, will lead to over-regulation, thereby curtailing the benefits that labor platforms create. Some proponents go so far as to suggest that, because the interests of the platform are intrinsically aligned with those of its ecosystem, platforms will naturally be motivated to invest in ecosystem protection. However, as already demonstrated across jurisdictions, “no regulation” is unlikely to be a practical approach that is widely adopted.
 
3. Self-regulation
 
A third, related argument champions self-regulation by the platform. Self-regulation is frequently proposed as the only feasible solution, given the information asymmetry that exists between the platform and other stakeholders, including the traditional regulator. The argument for self-regulation rests on two key pillars: first, that reputation systems are effective in guaranteeing market efficiency, and, second, that market efficiency is aligned with positive outcomes for all platform stakeholders. However, both arguments are contentious. Reputation systems can be manipulated and biased. And while self-regulation may work to the extent that it creates an efficient market, it is unlikely to empower actors whose interests are at odds with those of the platform owner. Though flawed, the argument for self-regulation throws a welcome, if harsh, light on the need for regulation to expand visibility into the opaque data and obscure workings of platforms. An independent regulator is required to ensure fair competition among platforms; delegating regulatory responsibility to the platform owner because of its exclusive access to this data is not a solution.
 
4. Using narrative to sidestep regulation
 
Platform promoters use phrases like “sharing economy” and “collaborative consumption”, which conjure a positive image of platforms in general, and of on-demand platforms in particular. These narratives present the platform as an intermediary that facilitates efficient, sustainable, and decentralized markets in a manner that needs no regulation. However, the concept of sharing can be cynically obfuscated. Platforms such as Couchsurfing, which started as not-for-profit intermediaries enabling sharing among participants, have moved on to create for-profit businesses focused on maximizing shareholder value, sometimes to the detriment of existing stakeholders. These decentralized production systems encourage a culture of sharing but answer to centralized governance and funding; the sharing-economy narrative of altruism and socialism is secondary to the platform’s profit-seeking behavior. While for-profit platforms may genuinely encourage a culture of sharing, the eventual centralization of profits and maximization of shareholder value are at odds with the overall narrative. More specifically, these platforms may improve market access and generate additional surplus, but this does not imply that such surplus is equitably distributed among all stakeholders. Any regulatory framework should ensure that these narratives do not function as a ploy to sidestep regulation while maintaining the control, information asymmetry, and profit centralization that lead to ecosystem exploitation.
 
 
Crafting regulatory responses
 
Regulators need to work towards the creation of appropriate ecosystem protection and empowerment, while ensuring that such regulation is applied not at the point of market entry but subsequent to it, using actual data from platform usage.
Data plays an important role in creating value and establishing power dynamics on the platform. Data enables the creation of efficient markets. Both consumer and producer behavior can be influenced using data. Data also creates enforced dependency for various platform users. Finally, the platform’s exclusive ownership of data also creates greater information asymmetry between the platform and all other stakeholders.
An extensible and effective regulatory framework for platforms must be centred on the regulation of data. To that end, the framework should involve four key components:
 
1. Decreasing information asymmetry between platform and ecosystem actors
 
Several patterns of ecosystem exploitation on platforms can be traced back to the information asymmetry that exists between the platform and its ecosystem. Decreasing information asymmetry would increase the bargaining power of the ecosystem.
 
2. Reducing ecosystem dependency driven by proprietary data that locks in users
 
If producers’ reputation data are locked to a specific platform, they cannot move to other platforms without starting from scratch, which further reduces their bargaining power. Reputation portability, as sketched below, is one possible remedy.
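The sketch shows a hypothetical signed-export format for a seller’s track record; the field names and the HMAC scheme are illustrative assumptions, not an existing standard.

```python
import hashlib
import hmac
import json

PLATFORM_SECRET = b"platform-signing-key"  # held by the exporting platform

def export_reputation(seller_id: str, rating: float, completed_orders: int) -> dict:
    """Package a seller's reputation as a signed, portable record that a
    receiving platform could verify and import."""
    record = {
        "seller_id": seller_id,
        "rating": rating,
        "completed_orders": completed_orders,
        "issuer": "platform-a.example",  # hypothetical issuing platform
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(PLATFORM_SECRET, payload, hashlib.sha256).hexdigest()
    return record

portable = export_reputation("seller_42", rating=4.7, completed_orders=1180)
print(json.dumps(portable, indent=2))
```

A production scheme would more plausibly use public-key signatures, so that any receiving platform could verify the record without holding the issuer’s secret key.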
 
3. Regulating through open data
 
The exclusive ownership of data by the platform also serves to obstruct effective regulation. Lacking visibility into actual behaviors on the platform, regulators resort to traditional regulation, which can often impede innovation without increasing ecosystem empowerment. At its most extreme, regulators may choose to ban a platform outright. Instead, platforms should cooperate with regulators by facilitating external access to their data. The incentive to do so would be much lighter regulation upfront. Access to data would be heavily curated to alleviate concerns that third parties could gain insight into a platform’s carefully nurtured competitive strengths. Regulators and platform owners would therefore need to work together to identify data that offer an understanding of relevant market behavior without reducing the platform’s competitiveness.
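In practice, curated access could mean sharing aggregates that reveal market behavior while suppressing detail that exposes competitive strengths. A minimal sketch, assuming hypothetical transaction fields and an illustrative suppression threshold:

```python
from collections import defaultdict

# Hypothetical raw platform transactions (never shared directly).
transactions = [
    {"category": "electronics", "seller": "s1", "price": 120.0},
    {"category": "electronics", "seller": "s2", "price": 115.0},
    {"category": "electronics", "seller": "s1", "price": 130.0},
    {"category": "books",       "seller": "s3", "price": 12.0},
]

def regulator_view(rows, min_count=3):
    """Aggregate by category and suppress small cells, so regulators see
    market-level behavior without seller-level detail."""
    groups = defaultdict(list)
    for row in rows:
        groups[row["category"]].append(row["price"])
    view = {}
    for category, prices in groups.items():
        if len(prices) >= min_count:  # suppress cells too small to anonymize
            view[category] = {
                "n_transactions": len(prices),
                "avg_price": round(sum(prices) / len(prices), 2),
            }
    return view

# "books" is suppressed (one transaction); "electronics" is aggregated.
print(regulator_view(transactions))
```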
 
4. Enabling alternate regulatory structures on the data
 
Even as platforms agree to provide access to their data, regulators must set up more agile and decentralized regulatory structures. With data access, the regulatory structure itself could work like a multi-sided platform: platform users act as producers of data, and third parties consume these data through API access. This would allow regulators to set overall regulatory guidelines and empower third parties such as research agencies to analyze the data and propose regulatory interventions based on actual market behavior. It would also allow regulation to expand at the rate of innovation. Just as platforms exploit decentralized value creation, this form of co-developed regulation would allow regulators to exploit decentralized regulation, keeping pace with innovation on the platform.
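A minimal sketch of what such third-party oversight could look like. The endpoint, the fields, and the take-rate threshold are all hypothetical assumptions; the point is that watchdog rules run against curated platform data rather than being written blind.

```python
# Stand-in for a response from a hypothetical regulatory data API,
# e.g. GET https://data.platform.example/regulatory/v1/fees
api_response = [
    {"month": "2021-01", "median_take_rate": 0.15},
    {"month": "2021-02", "median_take_rate": 0.17},
    {"month": "2021-03", "median_take_rate": 0.22},
]

def flag_fee_spikes(series, threshold=0.04):
    """Third-party watchdog rule: flag month-over-month jumps in the
    platform's take rate that exceed an agreed threshold."""
    flags = []
    for prev, curr in zip(series, series[1:]):
        jump = curr["median_take_rate"] - prev["median_take_rate"]
        if jump > threshold:
            flags.append((curr["month"], round(jump, 3)))
    return flags

# Findings like these would feed proposed regulatory interventions.
print(flag_fee_spikes(api_response))
```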
 
The unintended consequences of regulation
 
In 2020, it is now fairly clear that platforms have failed at self-regulation. Thoughtful and comprehensive regulations are required to protect democracy, public health, privacy, and competition in the economy. But regulation comes with consequences.
Critics have warned that detecting online child sexual abuse will become harder under privacy protections due to take effect in the European Union towards the end of 2020. Under the new rules, big tech firms like Facebook and Microsoft will be forbidden from using automatic tools that detect online grooming or images of child abuse. Critics of such tools contend that automatic scanning infringes on people’s privacy, especially for those using chat and messaging apps. Those opposing the European Electronic Communications Code directive worry that banning detection tools in Europe could lead firms to halt their use elsewhere in the world. “If a company in the EU stops using this technology overnight, they would stop using it all over the world,” said Emilio Puccio, coordinator of the European Parliament Intergroup on Children’s Rights.
As regulation speeds up in 2021, regulators will need to ensure that it effectively safeguards ecosystem interests without stifling data-driven innovation. Regulators will also need to coordinate with other actors and advocates to understand potential unintended consequences and craft regulations that mitigate them.
 
 