In the wake of the 2016 US elections, there are volumes of conversations taking place about our possible future, the ongoing tensions and conflicts, and the root causes of the election's outcome. The causes are numerous and not simple to categorize; your perception of the election's results may lead you to view a given cause as a positive or a negative, for example. But the causes are out there, and if they did not have the impact their drivers desired, then the results of the election compel those behind them to re-examine their purpose and impact. Facebook is one of them.
The New York Times ran an excellent article about Facebook's possible impact and how different groups within the organization are thinking about the issue. If you're unable to read that article because you don't have a subscription to the New York Times, I'd suggest you subscribe. The article also points out the conflicting viewpoints even within Facebook, as Mark Zuckerberg has publicly posted that he believes Facebook's involvement was minimal. Unfortunately, I believe Mr. Zuckerberg's comments have missed the forest for the fake tree.
Mr. Zuckerberg's post talks about the potential impact of fake stories that circulated on Facebook. He believes those stories had no impact, and also that once you go down the road of trying to mark stories as true or fake, you get into dangerous territory. Even mainstream reports may omit details or sometimes get stories wrong. That criticism is entirely valid, and it is entirely hogwash.
Certainly you can draw the line at marking what is a real or fake story, and you can argue about moving that line. Right now, no such line exists. That allows completely fabricated stories to gain widespread circulation, perpetuating their untruths. Once that bad information has taken hold, it is almost impossible to eliminate its impact, as Facebook well knows from the constant resurgence of untruths about Facebook itself (Facebook is going to start charging you; if you post a notice, you keep control of your content; Facebook now owns all of your photos; and so on). Even if a true story circulates right after the original fake one, a large number of people will still believe the fake story may have gotten a detail wrong but that its overall theme is true. And of course that has an impact.
Facebook and other social media sites have become widely popular by lowering the barriers to distributing content. We can now connect with people and share information with simplicity and ease. That has powerful positive effects, but it also has some drawbacks. The widespread dissemination of fake news is one of them, and it is one Facebook could address if it wanted to.
But there's a bigger picture here, one that I fear Facebook is missing by talking only about fake news. The true impact of Facebook and all of social media isn't just about fake news; it is that these platforms, designed to increase communication between people, may be doing the opposite. There is a wealth of articles and research about how the same technology that gives us access to so much content may also force us into a bubble of only the content we agree with. The most recent iteration is how this may have impacted the election, as this New York magazine article points out, but it is an older concept, as this fantastic 2011 TED talk explains (carve out nine minutes to watch it if you can).
This is where Facebook can best start looking in the mirror. Facebook doesn't just set up bubbles for its users; it is a bubble-generating machine.
Facebook stays successful by making sure you keep coming back. It wants to give you content you find compelling, with enough new material that you visit the site many times a day. It can't give you too much content, or you'll get frustrated and leave. And it can't give you content that will make you never come back, whether because you found it offensive or distasteful or for any number of other reasons.
This is the entire reason for Facebook's EdgeRank algorithm, and why you sometimes see articles complaining that Facebook users don't see all of their friends' posts. Facebook constantly tweaks this algorithm to maximize your time on Facebook. More time on Facebook means you keep coming back and see more of the ads Facebook sells to fund the platform. That makes sense from a platform and business perspective.
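To see how such a ranker naturally builds bubbles, here is a toy sketch based only on the factors Facebook has publicly described for EdgeRank (affinity, content weight, and time decay). The real algorithm and its parameters are proprietary; every name, formula, and number below is purely illustrative.

```python
# Toy EdgeRank-style feed ranker. Illustrative only: the real algorithm
# and its parameters are proprietary and far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float  # how often the viewer interacts with the author (0-1)
    content_weight: float   # how engaging this content type tends to be (0-1)
    hours_old: float        # age of the post in hours

def score(post: Post, decay_rate: float = 0.1) -> float:
    """Higher scores surface first; older posts decay toward zero."""
    time_decay = 1.0 / (1.0 + decay_rate * post.hours_old)
    return post.author_affinity * post.content_weight * time_decay

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sorting by engagement means you mostly see what you already
    # engage with -- the bubble-making dynamic described above.
    return sorted(posts, key=score, reverse=True)

feed = rank_feed([
    Post(author_affinity=0.9, content_weight=0.8, hours_old=5.0),  # close friend
    Post(author_affinity=0.2, content_weight=0.9, hours_old=1.0),  # rarely-read source
])
```

Note how the close friend's older post still outranks the fresher post from a source you rarely engage with: affinity dominates, so the feedback loop keeps feeding you more of the same.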
But as a content and media company, Facebook also needs to ask whether maximizing user bubbles is truly in anyone's best interest. Compare this to a snack food company that discovers that if it adds more sugar, people like the snacks more, eat more of them, and buy more of them. That makes sense from a business perspective, and yet it may not be the best possible outcome.
Facebook and others need to look themselves in the mirror and decide who they want to be. They can take the all-business approach of doing what is best for profits, or they can decide there is a greater responsibility at play. I don't know how to burst those bubbles if Facebook chooses to do so. I do know that Facebook has some of the most brilliant content engineers, data scientists, and platform designers on the planet. If they want to address this problem, they can start coming up with solutions. Bursting those bubbles may be vital in helping to bring people together, to increase our understanding of problems, and to come up with solutions. Popping those bubbles may help heal the polarizing partisanship that has only grown over the past years.
Those bubbles may be nice to live in, but they may choke us in isolation. It’s time to figure out whether they’re worth keeping.
Either way, Facebook needs to look at its role in defining public conversations and make a decision. Sticking its head in the sand and pointing at other causes is irresponsible. No, Facebook isn't entirely to blame. It also is not blameless. Where it goes from here is entirely within its control.