Facebook Is Not the Problem
I changed my mind about Facebook. Here's why you should, too.
Criticizing Facebook is an easy game, fashionable for leftist Twitterati and red-state legislators alike. Never has there been an anti-Facebook article without an eager audience on one side or the other waiting to click and retweet it. It’s an equal-opportunity Rorschach punching bag.
Of course, it must be acknowledged that Zuckerberg and company have brought much of this upon themselves: carpetbagging the entire 2016 election in favor of Trump, openly fostering foreign influence on the platform, selling users’ data to the highest bidder, implementing a ruthless attention-extraction architecture at all costs and seeding the ground for conspiracy and institutional distrust, all while remaining seemingly indifferent to the online abuse, teenage bullying and mental health harm the platform enables. They also take what appears to be a pretty laissez-faire approach to white supremacist movements and homegrown terrorist networks.
FB’s sins as a company are many, and consequently, it has become the platform everyone loves to hate. I myself was a proud, card-carrying member of that group for a long time. That is, until recently.
After the 2016 election, I, like many others, was furious with Facebook. I felt Zuckerberg had betrayed the public trust and sold America out. He was all too eager to let foreign actors through our electoral front door as long as they had dollars in hand—or in this case, rubles. Judging by internal leaks and external public statements, Zuck seemed oblivious and indifferent to the impacts that his network was having on functioning democracies everywhere.
But while my anger at Zuck felt righteous and deserved, I also wanted to understand what could be done about it. I felt uniquely positioned to understand the problem. I had an extensive background in postmodernism, deconstructionism and Sovietology. As an undergrad, I studied philosophy of mind, artificial intelligence and evolutionary psychology. I spent time in Eastern Europe interning and working for NGOs combating Soviet influence campaigns against pro-democratic politicians and movements. And I had run a successful production company crossing psychometric data with storytelling techniques, using neuroscience frameworks to maximize audience impact and content virality.
Countries earmark entire budgets to secure these networks and combat their abuse, so it is, as a general rule of thumb, foolish to think one person can wrap their head around the entirety of the problem. Understanding these platforms, and their problems, requires both broad foundational knowledge and intense specialization.
With that in mind, I spent the last four years making this my obsession. I talked to as many experts on the subject as I could. I annoyed former and current FB employees to give me their takes. I dove deep into game-theoretical models, complexity theory and memetic transmission. I wrote papers on meme/gene coevolution and how social media platforms mirror biological ecosystems. I read all the Facebook hit pieces and disgruntled takedowns while also suffering through the puff pieces and the self-congratulatory autobiographies. And with the help of other experts, I developed the rough outlines of what a healthier social media network could look like if implemented at scale.
And at the end of that journey, I have netted out as cynically sympathetic towards Facebook. I now see that many of the criticisms levied against the tech giant are laced with the same naivete that we were—and still are—so right to skewer Zuckerberg for having in the first place.
I’m not trying to defend Facebook per se, but to push back on Hollywood celebrities and public intellectuals who righteously and errantly are calling for Facebook’s dismantling. It’s an effort to highlight how complicated and dangerous of a world we now find ourselves in, with respect to social media, and how removing Facebook from the equation does little to make that world any less complicated and dangerous. In fact, it could very likely make it worse.
TikTok vs. the Devil You Know
It surprises me that, for all the liberal epithets levied at Facebook over the past four years, TikTok’s meteoric rise to social media rock-star status has been treated by the intellectual class as almost a quaint afterthought instead of what it really is: a national security threat.
TikTok has shown us that social media monopolies are not nearly as robust and resilient as we pretend they are. To equate Facebook to Amazon and Apple is to misjudge market monopolies entirely. In the social media world, all it takes is a million teenagers egged on by a few celebrity pop stars to start a new party on a different, newer platform, and bring the lion’s share of future generations’ social exchanges with it. The notion that this newer, sexier social platform will somehow behave more “ethically” is not tethered to any sort of realistic political or economic reality.
We saw this play out compellingly with Google’s forays into China. Google learned firsthand that you either play by Beijing’s censorship rules or Chinese competitors will copy your tech, scale faster with a larger user base and take your market share anyway. Google saw the writing on the wall and ultimately acquiesced, adding a bit of an Orwellian flavor to its old “don’t be evil” mantra.
For as much as it is difficult to parse the intelligence and folly behind any Trump administration foreign policy directive (the “broken clock is right twice a day” doctrine), the initial intention to push back against TikTok’s gatekeepers is mostly the right one. Psychometric user data is the oil of the 21st century, with national security implications that lie far beyond the scope of what brand of jeans you like to buy online. And unlike most natural resources, for which geography is the focal point of political conflict, personal data as a “precious mineral” is perpetually up for grabs in an explicitly non-geographic cyberspace. Right now it’s free for the taking by any kleptocratic or authoritarian regime with enough data scientists and engineers to pull the strings behind the scenes—no tanks and troops needed. The key point here is that it’s not what these companies and governments can do with your data today, but rather what they will be able to do with it tomorrow.
And while Facebook’s ethical posture leaves much to be desired, its speech policy is largely modeled on a U.S. constitutional interpretation of free speech. It’s still largely run by employees who don’t seem particularly eager to have history blame them for the downfall of liberal democracies. And as a company, it’s still largely beholden to American legislators, who, if they can get their act together, might be able to apply just the right kind of soft legislative pressure to bring about healthy reforms to the platform. So for as callously and flippantly as Zuckerberg has responded to justified public critique, it should be acknowledged that he is responding at all. That is more than can be said for a TikTok or a Huawei, whose data extraction has a particular CCP flair to it.
The irony then should not be lost on any of us that at the same time Facebook has publicly committed to taking down QAnon groups, banning Holocaust deniers and outlawing anti-vaccine advertising, China is using its data-capture capabilities to round up Uighur Muslims into concentration camps and arrest Hong Kong pro-democracy dissidents for daring to speak even a single ill word against the Party. And this is the point of the “Devil You Know” game. The relationship between publicly traded companies and the government inside the U.S. is typically defined by hostility, distance and stubborn reluctance. (See: Tim Cook digging in his heels on user privacy vs. U.S. law enforcement.) Conversely, in places like China and Russia, the lines drawn between business and government are superficial distinctions without a difference. When it comes to data collection, the consequences of those disparities couldn’t be starker.
There is a global digital cold war being played out here, blurring the lines between geopolitical soft-power games and legitimate nation-state spying. Judging by the blistering speed with which we brought TikTok into our homes, most Americans seem content to sleepwalk their way through it. We would be wise to pause, then, before throwing out Facebook in favor of the shiny new thing, especially when that thing has a “Made in China” sticker on the back.
Social Media Giants Are Drug Cartels
Liberal democracies depend on consensus-making to function: consensus predicated upon a shared set of starting points before any compromise on idealized endpoints. Those starting points are forged out of some agreement about a set of reality approximations, or what we might call “news.” If you can’t agree that up is up and down is down and two plus two equals four, then the idea of “compromise” as a political goal becomes irrelevant.
Right now, those reality approximations are concentrated in three marketplaces: FB (including Instagram), Snap and Twitter, all of them acting as de facto drug cartels. They deliver a product in high demand. Each of the cartels is eager to take a rival cartel’s market share if the opportunity arises. The demand curve for their products is now fixed: the country is addicted to social media. And while we figure out how to wean ourselves off this digital drug, our attention has turned to the supply side of things.
And so the first question we should be asking ourselves is what happens when you take out the cartel heads? If we use Mexico and Colombia and the war on drugs as analogs, what invariably happens is that rival gangs move in. Sectarian violence ensues. Territory battles spiral out of control. You have power vacuums where chaos and instability are the order of the day. (See also: Iraq, Libya, Egypt, ISIS, etc.)
Now imagine instead of drugs being sold, it’s the stock exchange for where reality approximations are traded, the ecosystem that gives life to those starting points that liberal democracies converge upon. A fracturing of the cartel heads could very well increase the epistemic divide between dissenting parties. The ensuing power vacuum and proliferation of mini “cartels” may only increase volatility and chaos in the system instead of mitigating it. If liberal democracies depend on a sense of shared reality in order to function, it is not hyperbole to assume that a hundred different Facebooks in the system would only increase our collective political and ideological divides, not heal them.
If this all sounds a bit theoretical and esoteric, try to envision a world where Call of Duty players, Manchester United soccer fans and Hello Kitty collectors each get their own separate news specially tailored for them. News aggregated not on a single platform (FB), but on individual platforms that never cross paths with one another, and whose only goal is to serve their users “news” that keeps them coming back to their portals. You do not need a political science Ph.D. to venture a guess as to where this all leads.
Facebook Isn’t Democracy (and That’s a Good Thing)
Churchill is famous for remarking, “It has been said that democracy is the worst form of government except for all those other forms that have been tried.” At heart, this is an argument about an often overlooked feature of systems design known as path constraints: the idea that the success of a designed system has more to do with economic forces, biological constraints and the laws of physics than with the execution or application of the idea itself. Said another way: There are a lot of ways to build a rocket ship that crashes and a very limited number of ways to build one that goes to the moon.
The point here is that the biggest structural problems that social media pose to liberal democracies really don’t have anything to do with the uniqueness of Facebook per se, but rather are germane to all social media platforms, everywhere.
The reality is that any other platform that steps into the void following Facebook’s demise is likely to encounter the same big-picture meta-problems Facebook is struggling with already. Reddit and Twitter have arguably navigated these waters in a more ethical way, but the problems are no less glaring and their solutions no more elegant.
There is little doubt that had MySpace, instead of FB, risen to global dominance, we would be having similar conversations about MySpace’s platform inadequacies. Fear, outrage and sensationalism sell. All of the platforms require turnover and engagement to stay profitable. These path constraints suggest that even with the noblest of CEOs and the most ethical of corporate boards, most improvements brought about by any new “ethical” platform would only be at the margins. (Granted, these margins are still vitally important.)
If we’re going to storm Facebook headquarters with pitchforks and torches in hand, it would be nice to know what is going to be built in its place and in what ways specifically it will be better than what exists already. In what ways are the new improvements invulnerable to the very same outcomes that awaited Facebook? In what ways can we be sure the new model won’t create an even more volatile and destructive set of unintended consequences on the back end?
And so Facebook isn’t “democracy” in the Churchillian sense. It’s just a profit-seeking firm, and many of its missteps are ones we are right to demand civic atonement for. But we should be especially wary of those critics proclaiming that a better platform would exist if only we could get rid of Zuck and his evil empire. We should be skeptical of those public figures on both the left and the right who are convinced that a perfect platform architecture and policy exists. How do they know what this magical solution is? Like all peddlers of empty utopian promises, they assure us that we’ll figure out the details once we get there.