The Facebook Trap

Facebook has a clear mission: Connect everyone in the world. Clarity is good, but in Facebook’s case it has also put the company in a bind, because the mission — and the company’s vision for creating value through network effects — has become the source of its biggest problems. As the company moved from connecting existing friends online to making new global connections (both examples of direct network effects) and now to connecting users to professional creators (indirect network effects), it has come under fire for everything from violating individual privacy to bullying small companies as a monopoly to radicalizing its users. Now it is struggling to find solutions that don’t undercut its mission. The author calls this “the Facebook Trap.” To address the problems created by the platform — and by other social networks, too — it helps to establish clearly where the company should be held accountable. While it’s reasonable to push for changes in how Facebook’s recommendations work, it’s harder to decide how the platform should deal with organic connections: Policing them would likely entail censoring users and blocking them from making connections that they want to make. Facebook isn’t the only company facing the conundrum of needing to undermine its own mission to minimize harm, and companies and governments will need to develop strategies for dealing with this issue.

Founded in 2004, Facebook’s mission is to give people the power to build community and bring the world closer together. People use Facebook to stay connected with friends and family, to discover what’s going on in the world, and to share and express what matters to them.

― Facebook Mission Statement

Our mission is to connect every person in the world.

― Mark Zuckerberg, CEO and Co-Founder of Facebook

Depending on who you ask, Facebook’s biggest problem might be almost anything. Critics have argued that it’s violating individual privacy or bullying small companies as a monopoly, damaging teens’ mental health or inciting violent insurrections — the list of possibilities goes on (and on). But varied as these troubles may seem, they are actually all facets of one big, fundamental problem that is staring all of us — policymakers, the general public, and Facebook’s own employees — right in the face.

Facebook exists to “connect every person in the world,” as CEO Mark Zuckerberg himself clearly and frequently pronounces. At face value, there is nothing wrong with that goal. In fact, it is exactly the kind of strategic clarity that strategy professors would like to see from more companies. As the guiding vision of Facebook leadership, this aspirational ideal has been deeply ingrained into Facebook’s company culture. Importantly, connecting people is the fundamental basis on which Facebook has been so successful over the last 15 years.

In my course on technology strategy, we teach students that the most important driver of value creation today is network effects: My own value from using Facebook — and Instagram, Messenger, and WhatsApp — grows as other users adopt and use Facebook. By following through on its mission to connect people, Facebook facilitates powerful network effects that have propelled its organic growth and reinforced its dominant position in social networking.
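
To make that logic concrete, here is a minimal Python sketch of a direct network effect: each user’s value grows with every other user on the network, so total network value grows roughly quadratically (the Metcalfe-style intuition). The linear value-per-connection function and its parameter are illustrative assumptions, not anything Facebook actually uses.

```python
# Illustrative sketch of a direct network effect. The value function and
# parameter below are assumptions for exposition, not Facebook's model.

def user_value(n_users: int, value_per_connection: float = 0.01) -> float:
    """Value to a single user grows with every other user on the network."""
    return value_per_connection * (n_users - 1)

def total_network_value(n_users: int) -> float:
    """Summed across all users, value grows roughly quadratically."""
    return n_users * user_value(n_users)

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> per-user value {user_value(n):>7.2f}, "
          f"total {total_network_value(n):>12.2f}")
```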

But as we are all experiencing today, that core purpose leads to myriad negative impacts on all parts of our society. That mission of connecting people is also destroying people’s lives and threatening our established institutions. Facebook faces a monumental challenge, because fixing these issues isn’t as simple as adding more moderators to watch for hate speech or changing the news feed — it will require a fundamental shift in the company’s core strategic goal. In that sense, Facebook is trapped: Network effects made the company a success and now they’re threatening to unmake it, but the company can’t just turn off the engine that makes it work. So, what can it do?

The Inevitable Evolution of Facebook

Following its core purpose means Facebook must continue to connect people, and to connect them in more intense ways. It can grow the user base and connect more people who otherwise wouldn’t have been connected, and it can get the existing user base to connect more intensely by using Facebook more, i.e., by increasing engagement. Both of these directly drive advertising revenue, the predominant mode by which Facebook captures value, i.e., monetizes the user base that otherwise uses Facebook for free — something I have written about with a coauthor.
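
As a back-of-the-envelope illustration of that value-capture logic, consider the toy calculation below. Every figure is a hypothetical placeholder, not Facebook’s actual numbers; the point is only that revenue scales multiplicatively with both the size of the user base and how intensely each user engages.

```python
# Toy calculation of ad-based value capture: revenue scales with users and
# with engagement. Every figure here is a hypothetical placeholder.

def daily_ad_revenue(users: float, minutes_per_user: float,
                     ads_per_minute: float, revenue_per_ad: float) -> float:
    """Daily revenue = users x engagement (minutes) x ad load x ad price."""
    return users * minutes_per_user * ads_per_minute * revenue_per_ad

base = daily_ad_revenue(2e9, 30, 0.5, 0.002)
boosted = daily_ad_revenue(2e9, 36, 0.5, 0.002)  # +20% engagement
print(f"base: ${base:,.0f}/day; +20% engagement: ${boosted:,.0f}/day")
```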

The issue is that even if Facebook were only motivated to create value for users — through connecting people — without any incentive to capture value through advertising, it would still be on the road to disaster. The difficulties it faces are a fundamental consequence of connecting people.

To understand why, let’s consider how Facebook has changed since its idyllic early days.

Old Friends

Initially, Facebook connected users to their real-life extended social circle — their local connections. As a millennial, I joined Facebook in high school as an extension of the friendships I already had. In this world, Facebook facilitated direct network effects, or reciprocal content generation between parties: I create content for my friends, and my friends create content for me. I would post some prom photos, my friends would post slightly different prom photos, and we would all comment on how great everyone looked. Even if someone looked bad in a photo, no one would ever write that: We still had to see each other in real life.

This version of Facebook had some major limitations. First, it didn’t really give me access to anything I didn’t already have in my life. When I became interested in DJing — a niche interest — I couldn’t connect with other DJs on Facebook, because I didn’t have any in my immediate network of real-life friends. Second, there was a finite amount of content — there are only so many prom photos. Third, regular users didn’t have the resources to generate “high-quality” content — no one was professionally airbrushing all those prom photos. In this world, the connections were relatively weak, in the sense that they didn’t optimize for the kind of intense, ongoing engagement that keeps a user coming back.

New Friends

Facebook solved this problem by bringing on millions — and eventually billions — of users and then facilitating global connections. Suddenly, through the stronger network effects of a larger user base, users with niche interests could connect and reach a critical mass. There are plenty of other DJs on Facebook and Instagram for me to connect with.

These global connections aren’t always good, however. Users with interests that are dangerous to themselves and to others can easily connect with one another and reinforce those interests. A user with suicidal thoughts may now seek advice from others with the same thoughts. A user with racist views can choose to be surrounded by other racists. And once connected, these users gather at critical mass and can coordinate activities. This can range from the relatively benign but still damaging, such as multi-level marketing schemes, to the coordination of events such as the January 6, 2021, attack on the U.S. Capitol, which was organized across many social networks but stoked by online communities of users drawn to conspiracy theories about election fraud.

No Friends

There’s another important shift that’s happened, too. As Facebook has evolved, it has begun to rely heavily on indirect network effects. Instead of peers reciprocally generating content for one another, a large user base of content consumers incentivizes the “professional” content producers to keep pushing out content, and the professional content keeps the large user base on Facebook and engaged.
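
The reinforcing loop between consumers and professional producers can be seen in a toy simulation. The linear update rules and coefficients below are illustrative assumptions: a larger audience draws in more producers, and more professional content retains a larger audience.

```python
# Toy simulation of an indirect network effect: the audience attracts
# professional producers, and producer content retains the audience.
# The linear update rules and coefficients are illustrative assumptions.

consumers, producers = 1_000.0, 10.0
for step in range(1, 6):
    producers += 0.001 * consumers   # producers enter to reach the audience
    consumers += 5.0 * producers     # professional content retains/attracts users
    print(f"step {step}: consumers={consumers:>9,.0f}, producers={producers:>6,.1f}")
```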

Relying on professional content producers to drive indirect network effects has a number of damaging consequences. First, it encourages elite individuals — celebrities or quasi-professional “influencers” — to portray an unachievable body image and lifestyle as normal, which Facebook’s own research finds can exacerbate depression, anxiety, and suicidal thoughts in young people. Second, it professionalizes the generation of “clickbait”: Both traditional media businesses and all-out bad actors have an incentive to pump out content and headlines that exploit users’ curiosity and emotional reactions. Third, it empowers professional extremists to spread explicitly dangerous messages at scale. ISIS has used Facebook effectively for its recruiting efforts by sharing videos of grotesque violence that resonate with disaffected youth.

The challenge for Facebook, and for us as a society, is that everything Facebook can do to solve its “problem” works directly against how it creates value and against its core mission. In essence, critics of Facebook are asking it to connect fewer people and to connect them less intensely. But that violates the core ethos of what Facebook has always set out to do. This is the Facebook Trap.

The Accountability Challenge

So what is Facebook to do about this problem? And how much of this problem can we as a society actually hold Facebook accountable for, through public pressure, regulatory policy, or other means? To answer these questions, let’s consider Facebook’s role in facilitating user-originated connections vs. algorithm-originated connections.

User-originated connections are the direct interactions between parties that the platform facilitated in the beginning. When Facebook started as a registry of Harvard undergraduates, a user could scroll through all the other students and choose to connect with the few whose content they wanted to see. A Facebook with only user-originated connections would be limited to fairly local connections and more of the direct network effects.

However, as a platform scales, it becomes harder and harder for a user to sift through and find the connections valuable to them. To ensure Facebook could continue to effectively connect people, it deployed algorithm-originated connections: a recommendation engine that uses the data users give the platform to suggest new friends and groups and to populate the newsfeed and search results. This heavy hand is what allows global connections to form and indirect network effects to emerge, bringing users the connections they want and will engage with most intensely.
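
A friends-of-friends recommender is one of the simplest examples of an algorithm-originated connection. The sketch below is a toy stand-in, not Facebook’s proprietary system: it ranks the users someone is not yet connected to by their number of mutual friends.

```python
# Minimal sketch of an algorithm-originated connection: rank a user's
# "friends of friends" by mutual-friend count. A toy stand-in for
# engagement-driven recommenders, not Facebook's actual system.
from collections import Counter

friends = {
    "ana":  {"ben", "cara"},
    "ben":  {"ana", "dev"},
    "cara": {"ana", "dev", "eli"},
    "dev":  {"ben", "cara"},
    "eli":  {"cara"},
}

def recommend(user: str, top_k: int = 3) -> list[tuple[str, int]]:
    """Rank non-friends by the number of mutual friends with `user`."""
    candidates = Counter()
    for friend in friends[user]:
        for fof in friends[friend]:
            if fof != user and fof not in friends[user]:
                candidates[fof] += 1
    return candidates.most_common(top_k)

print(recommend("ana"))  # [('dev', 2), ('eli', 1)]
```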

Why the Distinction Matters

Separating out which issues are a result of organic user-originated connections vs. Facebook-driven algorithm-originated connections gives us a sense of what Facebook can reasonably be held accountable for. Unfortunately, it doesn’t present easy solutions.

The scenarios where Facebook uses a heavy hand to facilitate connections are where we can rightfully look for some accountability — even if doing so works against Facebook’s mission. For instance, just because the data says that others really like being connected to incendiary parties or content does not mean that Facebook has to bring that content to my attention. The choice to not expose users to new content they wouldn’t have gone looking for is a relatively straightforward one.

The question of accountability becomes less clear when we consider whether the engine should recommend connections that a specific user actually wants, as revealed by data on the user’s own activity. Facebook’s mission implies that it should intentionally facilitate these connections, but these connections can intensify a user’s behavior and worldview. If a user with mild political leanings shows interest in reading about national politics, how much political content can Facebook recommend before it becomes extreme or even dangerous? Yes, Facebook can limit how it makes these recommendations — if only because individual users cannot hold themselves accountable — but there is no obvious line in the sand for Facebook to draw here.

But clear accountability goes completely out the window when users make connections on their own. To deal with problematic user-originated connections, Facebook would ultimately need to censor content and ban the users who create the content we deem problematic. There are some bright lines — of course explicit planning of violent activity should be barred — but the bulk of the potentially damaging content falls into a massive gray area. Consider the dark gray area of anti-vaccine content: If, say, we want Facebook to censor explicit misinformation, what should be done about nuanced, evidence-based content that describes a vaccine’s side effects? Facebook can adjust its algorithm to suppress recommendations of this content, but if users are going out of their way to find it, can or should Facebook censor it? Do we want it to?

This is the area where Facebook struggles the most. The company has repeatedly been inconsistent and non-transparent about how it censors content. Zuckerberg has tried to defer responsibility to a quasi-independent oversight panel, but critics accuse Facebook of intentionally denying the panel the resources and control to do its job comprehensively and effectively.

But this evasiveness derives from the accountability challenge intrinsic to social networking. Yes, we can hold Facebook accountable for what Facebook goes out of its way to connect us with. But can we hold Facebook accountable for what we go out of our way to connect with? As a company whose mission is connecting people, Facebook clearly does not want to be accountable for the connections that users genuinely want, whether Facebook delivers those connections or users find them on their own.

What Can Facebook Do?

As a strategy professor, I am probably more empathetic to Facebook than most. Facebook has a strategy of connecting people that has created a tremendous amount of value, but that same strategy is getting Facebook into a lot of trouble today. There are hard tradeoffs on all sides. My view is that there is no clear solution, but there are three broad routes that Facebook can pursue, potentially in conjunction.

Communicate the Tradeoffs, Transparently.

In past efforts to project responsibility, Facebook has implied that it has solutions to the problems it creates, solutions it doesn’t at present seem to have. As one route, Facebook could be more transparent about the fundamental tradeoffs that come with social networking by releasing research that documents specific issues, as with body image and Instagram, alongside its ongoing advocacy for the value that comes with connecting people. These insights can guide regulators and put Facebook in a good position to steer regulation in a direction favorable to the industry. Regulation that imposes costly compliance requirements can even act as a barrier to entry that protects incumbents like Facebook, as GDPR has in Europe.

Ramp Up Moderation, Massively.

To comprehensively moderate all its content, Facebook would need to continue advancing the frontier of algorithmic detection of undesirable content and increase the number of human moderators by an order of magnitude (or more). As of 2020, Facebook employed 15,000 human moderators who each review hundreds of content items daily, and it will need many more. This effort will cost billions of dollars and, perhaps more painfully for Facebook, force it to decide what content to restrict: curating for one person is censoring another. And no moderation effort can do much about the content running through encrypted WhatsApp or Messenger communications.
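
The scale problem is easy to see with some back-of-the-envelope arithmetic. In the sketch below, the 15,000-moderator headcount comes from the paragraph above, while the per-moderator throughput and the fully loaded cost per moderator are assumptions chosen only to show the order of magnitude.

```python
# Back-of-the-envelope arithmetic for the moderation-scaling claim above.
# Throughput and cost figures are assumptions used only to show the
# order of magnitude, not reported numbers.

moderators = 15_000          # stated headcount as of 2020
items_per_mod_per_day = 300  # "hundreds of content items daily" (assumed)
print(f"Items reviewable/day today: {moderators * items_per_mod_per_day:,}")

# Scaling headcount by an order of magnitude, at an assumed fully loaded
# cost per moderator, quickly reaches billions of dollars per year.
scaled_moderators = moderators * 10
cost_per_mod_per_year = 50_000  # hypothetical fully loaded annual cost (USD)
print(f"Annual cost at 10x scale: ${scaled_moderators * cost_per_mod_per_year:,}")
```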

Be Accountable, Appropriately.

Facebook needs clear boundaries around which aspects of its platform it wants to be — and can be — held accountable for, and it needs to clearly delegate accountability to governments, independent agencies, and users where it doesn’t. On algorithm-originated connections, it will be impractical to delegate accountability for what is often a black-box process — and this technology is a core piece of Facebook’s intellectual property — so Facebook needs to be ready to take responsibility for the connections its algorithm promotes.

But on user-originated connections to undesirable content, Facebook has been unclear about who is accountable. The quasi-independent Oversight Board moves Facebook toward delegating accountability, but the arrangement is still evasive and incomplete: The board only reviews Facebook’s content decisions after the fact, on appeal, and it remains financially dependent on Facebook and too small to operate at scale.

Moving forward, Facebook can take on genuine accountability itself by massively ramping up its own moderation efforts; it can publicly and credibly hand that accountability to an outside authority; or it can leave accountability in the hands of individual users by taking a stand and fighting for its original mission of connecting people freely, however they want. Right now, Facebook is ambiguously doing all three, leaving no one accountable at the end of the day.

Bigger Than Facebook

Facebook serves as a convenient lightning rod for ire, but Facebook could disappear from the face of the earth tomorrow and we would still face these problems again and again. The Facebook Trap is intrinsic to social networking as a whole, and it reflects the consequences of digital technology facilitating a more connected world.

Twitter has evolved along the same path as Facebook, toward using algorithms to connect people globally, with many of the same adverse consequences. Snap(chat), originally reliant on connecting friends, drastically redesigned its platform to drive indirect network effects that increase the amount of time users spend watching professional content. TikTok has rapidly become a powerhouse by using its best-in-class algorithms to connect users to the most engaging content globally, without having to build from a network of real-life friends.

We all need to reckon with the consequences of what it means to connect more people more intensely. To do that, and navigate this trap we’re in, Facebook and all the social networking platforms today (and yet to come) need a clear sense of what they will be accountable for. It’s time these companies — along with governments and users — tackle the Facebook Trap head on.
