The past few years have seen increasing demands for platforms to collaborate in fending off certain perceived threats created by online speech. These don't resemble traditional cartels: They are not hidden but touted, and they are widely seen as beneficial or even necessary. They are also fragmented; rather than a single cartel-like agreement between set members, these arrangements are taking place in different ways in different spheres. But from small beginnings, content cartels are enveloping ever more difficult and contentious areas of online discourse.
This is a pattern that will repeat in the story that follows. First, there is the identification of a threat caused by online speech that can be more effectively tackled through coordination. Initial concerns are voiced about the human rights implications of such centralized censorship measures. But ultimately those concerns give way to the view that the seriousness of the threat justifies such measures, provided that the domain is kept carefully circumscribed.
Again, however, the dynamic was the same: an unlikely collaboration between highly competitive firms, at first resisted and carefully circumscribed but ultimately embraced and expanded, in a form that lessens public pressure without increasing public accountability.
Content cartel creep is now spreading to the fight against foreign influence campaigns. This is an area of increased focus and activity, from perpetrators and defenders alike, in a manner reminiscent of the increased concern about terrorist content in the years before the establishment of the GIFCT.
The Stanford report highlights an essential caveat that is absent from current collaborations: Tighter coordination needs to have third-party oversight to be credible and legitimate.35 Without oversight, there is little to ensure that individual rights will be adequately respected. Such oversight is often part of the original vision of these collaborations, at the point in the conversation when free speech concerns are given their fullest hearing. But, as discussed further below, so far this has not been realized in practice.

35. See Stamos et al., supra note 32, at 49.
Say you start a company and all of a sudden the Nazis take over ... there are very few people you can call to come help you with your Nazi problem, or with your child safety problem or your suicide problem ... you can't expect that every company builds all these things from scratch ... . You could see Facebook turning their experience scaling trust and safety into an actual service they provide to all of the smaller companies who can't afford to hire thousands of people themselves.46

46. Vergecast: Is Facebook Ready for 2020?, The Verge (Aug. 27, 2019), -alex-stamos-interview-election-2020-vergecast [ -NQ8T].
All such proposals are a long way from the carefully circumscribed CSAM databases. But if you build it, they will come. And often, once there is a decision that certain content cannot be left to the marketplace of ideas, the decision quickly follows that platforms should cooperate to ensure consistent enforcement.
What explains the phenomenon of content cartel creep? There are a number of forces at work that serve as rationalizations for the need for platforms to cooperate more and across a greater variety of contexts.
Cross-platform coordination addresses other problems too. For example, in the case of extremist content, smaller platforms see an increase in such content on their sites as major platforms crack down.77 So without collaboration, these problems are not solved, only moved. Smaller platforms that do try to make changes can also find their efforts stymied by the lack of coordination from more mainstream sites. For example, when Pinterest took action against anti-vaccine misinformation, it found its efforts partially frustrated by the failure of other platforms to block such content, allowing users on Pinterest to simply link to it.78

77. See, e.g., Siddharth Venkataramakrishnan, Far-Right Extremists Flock to Protest Messaging App Telegram, Fin. Times (Dec. 16, 2019), -1c35-11ea-97df-cc63de1d73f4 [ -RB4B]; Katz, supra note 62.
78. Julia Carrie Wong, Pinterest Makes Aggressive New Move in Fight Against Vaccine Misinformation, The Guardian (Aug. 28, 2019), -anti-vaccine-combat-health-misinformation [ -32JS].
Forces toward the cartelization of content decisions will remain whether or not antitrust action is taken to break up the major social media platforms, and whether or not smaller platforms come to displace the current monoliths that dominate the public sphere. These questions are not going away. Content cartels are fundamentally a response to the growing consensus that there are certain areas that need to be placed beyond competition, both in the economic marketplace and in the marketplace of ideas. Civil society, users, and lawmakers are demanding more comprehensive responses from social media platforms, and platforms are seeing fewer upsides to resisting these calls in certain areas. Whatever regulatory action is taken with respect to the current tech giants to preserve the vitality of the public sphere, it needs to be done in a way that ensures that the problems of cartelization are not exacerbated.
One way to read the story of content cartels is as a tale of progress. For some harms created by online speech, collaboration between tech platforms can significantly limit the damage, which is why such collaborations are being pushed. But just as monopoly power over public discourse can be pernicious even when exercised for ostensibly beneficial ends, an opaque cartel may be no better. This part discusses four key ways informal and unregulated cartels threaten to exacerbate underlying problems in current content moderation practices.
Because of this potential for abuse, building in third-party oversight and accountability mechanisms from the start is essential. The GIFCT example shows that when institutions are set up as reactions to particular crises, the institutional design may not serve longer-term or broader interests.
Companies similarly appealed to the legitimacy of the GIFCT in the wake of the Christchurch massacre as evidence of their commitment to fighting the spread of violent footage.103 But when GIFCT members boasted that they had added over 800 new hashes to the database, there was no way to verify what this meant or whether it was a good marker of success.104 There was, for example, no way to know if these included legitimate media reports that used snippets of the footage, or completely erroneous content, or what proportion of the variants of footage uploaded the figure represented. These deficiencies repeated themselves in the wake of the Halle livestream, even as the platforms were congratulated for their effective response.105

103. evelyn douek, Australia's 'Abhorrent Violent Material' Law: Shouting 'Nerd Harder' and Drowning Out Speech, Austl. L.J. (forthcoming 2020) (manuscript available at _id=3443220).
104. Liz Woolery, Three Lessons in Content Moderation from New Zealand and Other High-Profile Tragedies, Ctr. for Democracy & Tech. (Mar. 27, 2019), -lessons-in-content-moderation-from-new-zealand-and-other-high-profile-tragedies [ -Z6J7].
105. Uberti, supra note 16.
Those concerned with monopoly power over public discourse should similarly be concerned about the rise of content cartels. But is it possible to keep the baby of helpful collaboration and throw out the bathwater of harmful cartels? In some areas and for some problems, platforms working together can be beneficial. But in which areas and how platforms collaborate is as important as that they do.
The appropriate level of cooperation is not an easy question. It might be tempting to say that cartels are always too great a threat to diversity in the marketplace of ideas, but such a response exacts a large cost. If you accept that there need to be standards for speech online,124 it is difficult to defend the proposition that these standards should not be enforced effectively. In cases where a lack of coordination means simply moving the problem around or, worse, smaller platforms either using blunter tools or not moderating at all, collaboration could be a boon. Should smaller platforms be denied the technology to remove violent propaganda on their platforms (which in many cases ends up there only after being banned by major platforms) in the name of marketplace diversity? Should these platforms be forced to choose between less nuanced hate speech detection tools or a species of free speech absolutism to avoid cartelization? What if this leads to exactly the kind of echo chambers that are most concerning? Homogenization of the public sphere cannot be our only concern.

124. The staunchest free speech advocates may not accept this proposition. But this is now a minority position: The idea that online speech should be left to the marketplace of ideas alone is distinctly unfashionable, and not one I subscribe to. One can accept that there is both an element of moral panic and a risk of overreaction in current debates while also acknowledging that there are areas where effective and responsible content moderation is a necessity.
Between these two ends of the spectrum are the hard cases. Ultimately, the answer should depend on an empirical inquiry into factors such as the prevalence of that category of content; the accuracy of the relevant technology; the cost and practicality of small platforms developing similar tools; the relevant risk of harm; and, especially, the contestability of the category definition and whether it implicates speech, such as political speech, that is ordinarily highly protected. More research is needed for a true assessment of social welfare costs and benefits. This requires greater openness from companies (with a nudge from regulators, if necessary). In the meantime, in these cases, ad hoc, opaque cartelization should not be encouraged.