The Case Against Facebook from Evolutionary Biology

So, what are we going to do about Facebook?

It has been clear for at least five years now, or should have been, that the status quo is untenable, not least because the era of social media seems to have degraded our ability to have grown-up conversations about important problems, and one of those problems is social media itself. So there is no consensus about what to do, mostly because we can't agree on what the problem is in the first place. Much of what I will say here applies to other platform social media, especially YouTube, but Facebook is now a bit like McDonald's: not the only problem we have, but the biggest, and most objectionable, fish in the barrel.

The grievances range from the very specific to the very broad. The tech companies are too big. Hate speech is rife on the platform. There are filter bubbles. There are fake accounts. There is the question of whether data-privacy rights should be waivable at all, even when one agrees to the terms and conditions. There are Facebook's own experiments in human psychology, done without subject consent. And there is the apparent increase in teen suicide, long suspected and now increasingly linked by research to the use of platform social media. That last has been enough for me to give a hard "no" to any social media use by my daughters, so far (thankfully they are not yet old enough for this to make them social pariahs).

Each of these grievances is specific enough that we can attach hypothetical solutions to it. Even if governments seem unwilling to act, at least the possibility is there. But I think they all miss the point: they are symptoms of a deeper problem, or ancillary to it. As engineers are fond of saying: what is the real problem we're trying to solve here?

In the days before the internet, companies that dealt in information generally fell into one of two categories. First, they could be publishers. The New York Times might print "all the news" that was "fit", but it was on some level responsible for deciding what fell into that category. Yes, journalists are human beings who make mistakes, and occasionally some of them have been guilty of actual fraud. But like science done right, journalism done right has a tendency to self-correct, and we shouldn't fixate on the exceptions when they are clearly that.

The second category was the distributor, such as the postal service. Not only was the postal service not responsible for what got sent through the mail, it wasn't even allowed to look, except in extreme cases. The key idea is that it was a common carrier: one piece of mail has no greater likelihood of reaching its destination than any other. The principle of common carriage has a long history when applied to information companies: it was central to the case for breaking up AT&T. It is really an older name for "net neutrality", and it is why we rightly worry about internet service providers becoming too monopolistic.

Facebook (and Twitter, YouTube, etc.) want to have it both ways. They don't want to be a publisher; no one wants to be a publisher. Publishers are the guys Facebook is disrupting. But neither do they want to be a common carrier: if they couldn't pick and choose what appears in your timeline, most of it wouldn't interest you, and being on Facebook would feel too much like going through your mail (or worse, your email). So Facebook has it both ways: it is not a publisher, because it doesn't censor your speech by choosing what you can post, but it is not a carrier either, because it still chooses what you see.

The Algorithm is where things go wrong. To see why, we need to temporarily suspend our belief in ourselves as free agents reading, posting, and sharing what we choose. This is not to say that we don't each individually have free will, just that free will is a funny thing: each of us knows we have it (I just shouted "peanut butter" apropos of nothing, just to remind myself), but it tends to average out en masse. No individual human believes their behaviour or thoughts are controlled by that article someone shared, but on average, a widely shared article influences people. No one believes they bought a car because they saw a commercial, but show a commercial to a million people and sales will go up.

Instead of thinking of Facebook's users as the agents that matter, think of the posts themselves as agents. Really, it isn't the posts that matter, but the ideas contained within them. They can survive, by finding their way to the top of a lot of users' feeds. They can reproduce, selectively, by being re-shared, re-mixed, or re-posted on another platform. They can mutate. And what do we have when information survives, reproduces, and mutates under selection? We have evolution.
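
To make that framing concrete, here is a minimal toy simulation of the idea in Python. Everything in it is an assumption made for illustration: the two traits ("outrage" and "truth"), the engagement-scoring rule, the weights, and the population numbers are all invented, and none of it reflects how any real feed-ranking system works.

```python
import random

# A deliberately crude toy model of memetic selection in a ranked feed.
# The traits, weights, and numbers are invented purely for illustration.

random.seed(42)

def engagement_score(meme, outrage_weight=1.0, truth_weight=0.1):
    """Hypothetical ranking rule: how readily a feed surfaces this meme."""
    return outrage_weight * meme["outrage"] + truth_weight * meme["truth"]

def mutate(meme, rate=0.05):
    """Re-sharing copies a meme imperfectly, so its traits drift a little."""
    return {k: min(1.0, max(0.0, v + random.gauss(0, rate)))
            for k, v in meme.items()}

def simulate(generations=200, population=500, **weights):
    # Start from memes with random levels of each trait.
    memes = [{"outrage": random.random(), "truth": random.random()}
             for _ in range(population)]
    for _ in range(generations):
        # The feed surfaces the highest-scoring half of the population...
        memes.sort(key=lambda m: engagement_score(m, **weights), reverse=True)
        survivors = memes[: population // 2]
        # ...and each surfaced meme is re-shared, i.e. copied with mutation.
        memes = survivors + [mutate(m) for m in survivors]
    mean = lambda trait: sum(m[trait] for m in memes) / len(memes)
    return round(mean("outrage"), 2), round(mean("truth"), 2)

print(simulate())                                      # outrage drifts up
print(simulate(outrage_weight=0.1, truth_weight=1.0))  # swap weights: truth wins instead
```

The detail worth noticing is that swapping the weights changes which traits win, but not the fact that something wins: any consistent ranking rule creates selection pressure of some kind, a point that matters when we get to possible remedies below.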

The idea of ideas themselves spreading and mutating is not totally new. The word coined by Richard Dawkins in 1976 to describe such a unit of cultural transmission was "meme": a mental analogue of the gene. The word has since been co-opted in the internet era to refer to something more specific, but we have to reclaim its original meaning, because it is the best term available for what might really be going on.

If memetics is the correct mental framework for thinking about Facebook's problems, how would we know? What predictions does the hypothesis make? First, memes sometimes spread with shocking rapidity, like biological viruses, and we have internet-era terminology to describe that, too. But I think the more telling prediction is the spectrum of truly bizarre forms that have been, and are being, evolved. Anyone who has studied the natural world can tell you that biological evolution has produced some really strange things. Memes are now struggling to survive and reproduce in a totally novel environment, and the organisms that are succeeding are not ones we could have predicted in a world of newspapers and postcards. QAnon may seem like literal insanity to someone not familiar with it, but perhaps it is just highly adapted to its environment. Perhaps "raving lunacy" is no longer a trait that counts against reproductive fitness, while "being true" doesn't mean as much as it used to.

The bottom line is, we don't know enough right now to confirm or exclude this as a hypothesis. The Algorithm and the data it feeds on are closed to us, and we can only speculate.

So what can we do? If memetics is right, attempts to tweak The Algorithm won't work: they might introduce different selective pressures, but the basic capacity for selection will still be there. Since the goal of every meme is to make copies of itself, it will keep doing so under whatever parameters are applied to it. Similarly, censoring hate speech will merely kill off those memes which qualify, while opening a broader ecological niche for everything that just barely doesn't. Breaking up Facebook (or Meta) by spinning off Instagram and WhatsApp is a dead end: QAnon and its ilk are hardly confined to Facebook. Meaningful data-privacy rules would be great, but since they would leave Facebook without a business model, they are less a remedy than a shutdown by other means.

I believe the right approach is to break up the company, but not by spinning away its subsidiaries. There is a need in society for both information publishers and carriers, just not within the same organization. If Facebook wants to be a platform, it should be a neutral platform where content can be published. If that content is available in a machine-readable format, then someone else (or, hopefully, many someone elses) can produce an algorithm to sort it into a timeline. Open data would finally mean competition among both platforms and algorithms, and it would mean researchers could find out what's really going on. Of course, if someone wants to recreate the Facebook experience and use an algorithm that merely confirms everything they already believe, no one can stop them, but at least they would do so knowing they are asking to be deceived.
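
As a rough sketch of what "open data plus pluggable algorithms" could mean in practice, consider the following Python fragment. The post format and the two scoring functions are entirely hypothetical; no such public feed or API exists today.

```python
from typing import Callable, Iterable

# Hypothetical open-data scenario: posts are published in a machine-readable
# form, and anyone can supply their own scoring function to build a timeline.
Post = dict  # e.g. {"author": str, "text": str, "timestamp": float, "shares": int}

def build_timeline(posts: Iterable[Post],
                   score: Callable[[Post], float],
                   limit: int = 50) -> list:
    """Rank the same public stream of posts by whatever scoring function
    the user (or a third party) chooses."""
    return sorted(posts, key=score, reverse=True)[:limit]

# Two competing "algorithms" over the same open data:
chronological = lambda post: post["timestamp"]
engagement_bait = lambda post: post["shares"]

sample_posts = [
    {"author": "alice", "text": "cat photo", "timestamp": 3.0, "shares": 2},
    {"author": "bob", "text": "hot take", "timestamp": 1.0, "shares": 90},
]
print(build_timeline(sample_posts, chronological))    # alice's post first
print(build_timeline(sample_posts, engagement_bait))  # bob's post first
```

The point is not this particular interface but the separation of concerns: the neutral platform publishes, and the ranking, with all its incentives, lives somewhere it can be inspected, swapped out, and studied.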

We don't want to know how deep this rabbit hole goes. We don't want to know how good memes can get, or how well they can adapt to hijack our own powers of reason. Like viruses, they are capable of evolving much, much faster than we are. Perhaps our immune systems will adapt, just as they may have after the printing press, but I don't think we have the luxury of waiting to find out. We need vaccines, and we need them now.
