Zuckerberg’s Ugly Truth Isn’t So Ugly

A review of the 2021 book “An Ugly Truth: Inside Facebook’s Battle for Domination” by Sheera Frenkel and Cecilia Kang. The truth is far more complex.

Writing this review didn’t come easily. I spent five years helping to mitigate and solve Facebook’s thorniest problems. When the book was published, I perceived it as an attack on Facebook orchestrated by the New York Times, a publicly listed company and direct competitor in the attention and advertising market. Today, I know that my perception then was compromised by Meta’s relentless internal corporate propaganda.

Similar to Chaos Monkeys, An Ugly Truth tells a story limited to the information available at the time. The book claims unprecedented access to the executive leadership reporting directly to Mark Zuckerberg and Sheryl Sandberg. It focuses on the period roughly between 2015 and 2020, arguably Facebook’s most challenging time. Despite a constant flow of news reporting about Facebook’s shortcomings, the book, for the most part, remains focused on the executive leadership decisions that got the company into hot water in the first place. Across 14 well-structured and well-written chapters, the authors build a case of desperation: in an increasingly competitive market environment, Facebook needs to innovate and grow its user statistics to beat earnings and satisfy shareholders. Yet the pursuit of significance clouded the better judgment of Facebook’s executive leadership team, and the rational voices of protective, concerned leaders were eventually drowned out by the self-serving voices of staff interested only in advancing at any cost.

To illustrate this point, the authors tell the story of former Chief Security Officer Alex Stamos, who persistently called out data privacy and security shortcomings:

Worst of all, Stamos told them (Zuckerberg and Sandberg), was that despite firing dozens of employees over the last eighteen months for abusing their access, Facebook was doing nothing to solve or prevent what was clearly a systemic problem. In a chart, Stamos highlighted how nearly every month, engineers had exploited the tools designed to give them easy access to data for building new products to violate the privacy of Facebook users and infiltrate their lives. If the public knew about these transgressions, they would be outraged […]

His calls, however, often went unanswered or, worse, prompted other executives threatened by Stamos’ findings to take hostile measures.

By December, Stamos, losing patience, drafted a memo suggesting that Facebook reorganize its security team so that instead of sitting on their own, members were embedded across the various parts of the company. […] Facebook had decided to take his advice, but rather than organizing the new security team under Stamos, Facebook’s longtime vice president of engineering, Pedro Canahuati, was assuming control of all security functions. […] The decision felt spiteful to Stamos: he advised Zuckerberg to cut engineers off from access to user data. No team had been more affected by the decision than Canahuati’s, and as a result, the vice president of engineering told colleagues that he harbored a grudge against Stamos. Now he would be taking control of an expanded department at Stamos’s expense.

Many more of those stories will never be told. Engineers and other employees, much smaller fish than Stamos, who raised ethical concerns about security and integrity were routinely silenced, ignored, and “managed out” – Facebook’s preferred method of dealing with staff who refused to drink the Kool-Aid and toe the line. Throughout the book, the authors maintain a neutral voice, yet it becomes very clear how difficult the decisions were for executive leadership. It seems as though leading Facebook were the real-world equivalent of the Kobayashi Maru – an everyday no-win scenario. Certainly, I can sympathize with the pressure Mark, Sheryl, and others must have felt during those times.

Take the case of Donald John Trump, the 45th President of the United States. His Facebook Page had a reach of 34 million followers (at the time of this writing). On January 6, 2021, his account actively incited his millions of followers to view Vice President Mike Pence as the reason for his lost bid for reelection. History went on to witness the attack on the United States Capitol. Democracy and our liberties were under attack that day. And how did Mark Zuckerberg and Sheryl Sandberg respond on behalf of Facebook? First, silence. Second, indecision. Should Trump remain on the platform? Should his account be suspended temporarily? Indefinitely? Eventually, Facebook’s leadership punted the decision to the puppet regime of the Oversight Board, which returned the decision power, citing a lack of existing policies to govern such a situation. Facebook’s executive leadership froze like a deer in the headlights. Yes, Zuckerberg’s philosophy on speech has evolved over time. Trump challenged this evolution.

Throughout Facebook’s seventeen-year history, the social network’s massive gains have repeatedly come at the expense of consumer privacy and safety and the integrity of democratic systems. […] And the platform is built upon a fundamental, possibly irreconcilable dichotomy: its purported mission is to advance society by connecting people while also profiting off them. It is Facebook’s dilemma and its ugly truth.

The book contains many more interesting stories. There was a wealth of internal leaks, desperate attempts to steer Facebook’s leadership back onto its original course. There were the infamous Brett Kavanaugh hearings, which highlighted the political affiliations and ideologies of Facebook executive Joel Kaplan, who publicly stood by Kavanaugh through the sexual assault allegations brought by Christine Blasey Ford, despite the outrage of Facebook’s female employees. Myanmar saw horrific human rights abuses enabled by and perpetrated through the platform. Nancy Pelosi, Speaker of the U.S. House of Representatives and a Bay Area representative since 1987, was humiliated when Facebook fumbled the removal of a video of one of her speeches that had been manipulated to make her sound slurred. And the list goes on.

The book is worth reading. The detail and minutiae afforded to accurate, convincing reporting are rich and slow-burning. That being said, Facebook has been dying since 2015; users are leaving the platform and deleting their accounts. While Instagram and WhatsApp prop up the company’s advertising revenue for the time being with stronger performances abroad, the five years of Facebook’s executive leadership covered in this book point toward an undeniable conclusion: it failed.

NPR’s Terry Gross interviewed the authors Sheera Frenkel and Cecilia Kang on Fresh Air. The interview further demonstrates the dichotomy of writing about the leadership of one of the most influential and controversial corporations in the world. You can listen to the full episode on NPR’s website.

Who Holds The Pen? 

Richard Stengel’s memoir illustrates the complexity of modern government.

Richard Stengel served as the Undersecretary of State for Public Diplomacy and Public Affairs under the 68th Secretary of State, John Kerry. In his memoir “Information Wars – How We Lost The Global Battle Against Disinformation & What We Can Do About It”, he recounts his time working for the Obama administration. Arguably, the Obama administration was a forward-leaning government, calibrated to modern technology and with a pulse on current affairs. Stengel captures the struggles that even such a modern government must overcome, from protocol and etiquette at meetings to the clearance process for social media use and other technology. Recounting his efforts to drive the democratic narrative online while combating bad actors, Stengel observes:

“One of the things I’d noticed in government is that people who had never been in media, who had never written a story or produced one, […] who didn’t understand audiences or what they liked, seemed to think it was easy to create content. People had the illusion that because they consumed something, they understood how it worked.

This fallacy applies to many more segments of society, not just government. It illustrates how technology is misunderstood by a public that tends to forget that policy decisions and strategy at scale, impacting thousands if not millions of people, are incredibly difficult to fine-tune and nuanced at every level. Stengel offers an example of counter-messaging the Islamist terrorist group Boko Haram on social media through the Center for Strategic Counterterrorism Communications (CSCC). Boko Haram had kidnapped some 276 girls from a secondary school in Nigeria. The idea was simple: show support for the kidnapped girls in an online campaign. Stengel approved the content for the campaign. Ten days later, he found out the content had been objected to by the Africa bureau. After the content was updated with the Africa bureau’s feedback, it was approved again, but it still failed the clearance process because the Bureau of Intelligence and Research objected to the changes. Ten days of silence on social media is tantamount to a lifetime of non-existence. Stengel went on to learn that things he expected to take hours would take days; things he expected to take days would take weeks; things he expected to take weeks would take months. Government departments default to “no” far more often than to “yes”. It made me think about new ways to improve government – and it is an urgent reminder that government needs disruption.

Another interesting lesson from this book concerns the balance between diplomacy, career development, and leadership. His interactions with Secretary of State John Kerry testify to Stengel’s business acumen, despite working in government. About Kerry, Stengel notes:

“He’s permanently leaning forward. That was his attitude about the world as well. To plunge in, to move forward, to engage. There’s no knot he doesn’t think he can untie, no breach that he can’t heal. For him, the cost of doing nothing was always higher than that of trying something.

It’s almost bittersweet to read these lines of optimism considering the slow pace at which the State Department moved during the heyday of ISIS, al-Qaeda, and Boko Haram, all the way up to the Russian influence operation to undermine the 2016 US presidential election. Then again, Stengel captures the predicament of the government at the time when he writes:

“What few of us understood at that point was that our opponents– Russia as well as ISIS –wanted us to get into a back-and-forth with them. It validated what they were doing, brought us down to their level, and besides, we weren’t as good at it as they were. They won when they got us to respond in kind.”

Engagement and impressions are everything online. Capturing our attention is the success metric for effective influence operations. This can be an overt diplomatic endeavor, like the Iran nuclear deal, which sought to bring the United States and Iran a step closer together, or a clandestine operation, like ‘Glowing Symphony’, which sought to deplatform ISIS and eradicate its narrative online.

Information Wars deserved a more accurate title. Other than that, I found Stengel’s memoir quite illuminating when it comes to government processes and how the State Department aligns itself with the current administration. As a journalist by trade and former managing editor of Time magazine, Stengel writes in a simple, narrative style. The density could be higher; it sometimes reads like a magazine. Across seven parts and numerous chapters, a wealth of personal anecdotes and experiences dilutes the lessons of this book. Without them, this 314-page memoir could have been both a concise non-fiction account of influence operations and a focused memoir of his life.

When Did Truth Die?

Michiko Kakutani offers an eloquent compilation that explains the decay of veracity in the United States. But perhaps more importantly, it skillfully weaves together almost a century of painful lessons from history, literature, and politics.

The Death of Truth was highly scrutinized by media publishers, book critics, and the greater literary community at the time of its publication; Google the reviews. As the title suggests, The Death of Truth – Notes on Falsehood in the Age of Trump by Michiko Kakutani argues that truth should be added to the list of casualties of the former Trump administration. Reading this book at the end of 2021, almost exactly one year since Joe Biden became the 46th President of the United States and almost three and a half years after its initial release, I can’t help but view it as a compilation of essays that are really bite-sized opinion pieces. This makes for an immersive, moving reading experience, but it also renders the message of The Death of Truth the very polemic it seeks to quash. Admittedly, a provocative diagnosis of our current political landscape is hardly possible in the total absence of partisanship.

Kakutani brilliantly threads her analysis, starting with a historical review of culture wars and past regimes’ handling of truth. She gradually escalates her storyline into the twenty-first century, with humanity’s dependency on social media, the algorithmic subversion of political decision-making, and foreign actors exploiting the American focus on self-pursuit at the expense of civic responsibilities. In her epilogue, Kakutani warns of the continued erosion of democratic institutions. We, the people, must protect the institutions that uphold democracy. At the same time, there are no easy remedies or shortcuts that will fix our polarized cultural divisions. Times like these require deft civil disobedience by the many who publicly reject the cynicism and resignation pursued by the totalitarian few.

People who are likely to read this book are unlikely to learn something new, but I believe it’s still worth it for the extensive reading resources provided by Kakutani. Her remarkably colorful writing style and sobering outlook on the future state of veracity in the United States won’t disappoint either. NPR’s Michael Schaub nailed it when he wrote: “The Death of Truth is a slim volume that’s equally intriguing and frustrating, an uneven effort from a writer who is, nonetheless, always interesting to read.”

Twitter And Tear Gas

Zeynep Tufekci takes an insightful look at the intersection of protest movements and social media.

Ever since I read Gustave Le Bon’s “The Crowd”, I’ve been fascinated with crowd psychology and social networks. In “Twitter And Tear Gas – The Power And Fragility Of Networked Protest”, Zeynep Tufekci connects the elements of protest movements with 21st-century technology. In her work, she describes movements as

“attempts to intervene in the public sphere through collective, coordinated action. A social movement is both a type of (counter) public itself and a claim made to a public that a wrong should be righted or a change should be made.”

In times of far-reaching social media platforms, restricted online forums, and end-to-end encrypted private group chats, the means to organize a protest movement have drastically changed. 

“Modern networked movements can scale up quickly and take care of all sorts of logistical tasks without building any substantial organizational capacity before the first march or protest. (…) The Gezi Park moment, going from almost zero to a massive movement within days clearly demonstrates the power of digital tools. However, with this speed comes weakness, some of it unexpected. First, the new movements find it difficult to make tactical shifts because they lack both the culture and the infrastructure for making collective decisions. Often unable to change course after the initial, speedy expansion phase, they exhibit a ‘tactical freeze’. Second, although their ability (as well as their desire) to operate without defined leadership protects them from co-optation or “decapitation,” it also makes them unable to negotiate with adversaries or even inside the movement itself. Third, the ease with which current social movements form often fails to signal an organizing capacity powerful enough to threaten those in authority.”

While these movements often catch the general public by surprise, it really comes down to timing and commitment by a group of decentralized actors. These actors, who come from all walks of life, seek to connect with others as rapidly as possible by leveraging the unrestricted reach of social media. Social media creates ties with a variety of supporters. Tufekci points out

“people who seek political change, the networking that takes place among people with weak ties is especially important. People with strong ties already share similar views (…). Weaker ties may be far-flung and composed of people with varying political and social ties. Also, weak ties may create bridges to other clusters of people in a way strong ties do not.”

Protest movements predating social media often shared similarities with multi-day music festivals, overnight camps, or even military training exercises. They instill a sense of camaraderie that attracts a certain type of individual. Today’s protest movements differ in that they can erupt quickly but fall apart as fast as they came to be. Still,

“many people are drawn to protest camps because of the alienation they feel in their ordinary lives as consumers. Exchanging products without money is like reverse commodity fetishism: for many, the point is not the product being exchanged but the relationship that is created.”

In addition, the speed at which modern movements operate invites individuals who are disconnected from broader society, or who simply prefer a short-lived special operation to right a policy wrong over the long-term work of building and maintaining relationships powerful enough to drive policy change organically.

“Some online communities not only are distant from offline communities but also have little or no persistence or reputational impact. (…) Social scientists call this the “stranger-on-a-train” effect, describing the way people sometimes open up more to anonymous strangers than to the people they see around every day. (…) Such encounters can even be more authentic and liberating.”

Tufekci spends much time describing the evolution of social interactions in a networked space and the social inertia that must be overcome to pick up momentum, but she also offers insights on the defensive considerations that make a protest movement work. First and foremost, a protest movement garners attention online, which in turn creates an influx of supporters. It will also attract opposition from private individuals, political opponents, and current political leaders. Those in power had previously relied, and in some countries still rely, upon censorship and the suppression of information. Twitter and other social media platforms have disrupted this control over the narrative:

“To be effective, censorship in the digital era requires a reframing of the goals of censorship not as a total denial of access, which is difficult to achieve, but as a denial of attention, focus, and credibility. In the networked public sphere, the goal of the powerful often is not to convince people of the truth of a particular narrative or to block a particular piece of information from getting out, but to produce resignation, cynicism, and a sense of disempowerment among the people.”

I apologize for using a wealth of quotes from her book, but it’s best described there, in her own words. Protest movements are here to stay. Understanding how democratic nations evolve their policies, right political wrongs, and influence authoritarian nations through subtle policy, online protest, and real-world tear-gas confrontations will help us make more informed decisions as we pick our political battles. Zeynep Tufekci has put together a well-researched account that helps make sense of the most important and controversial online protest movements, from the Occupy Gezi and Occupy Wall Street movements to the Egyptian Revolution and the Arab Spring, to Black Lives Matter, MeToo, and the March For Our Lives. There are two noticeable drawbacks to this otherwise excellent book. First, the chapters appear uncoordinated within the book and are too long; the reader can’t take a breather without losing the thread. Second, her examples are chronologically disconnected from the actual movements. While this helps illustrate certain points, I found it confusing. Twitter And Tear Gas has its own website: check it out at https://www.twitterandteargas.org/ or reach out to the author on Twitter @zeynep.

Learn To Discern: How To Take Ownership Of Your News Diet

I am tired of keeping up with the news these days. The sheer volume of information is intimidating. It creates the challenge of filtering relevant news from political noise, only to then begin analyzing the information for its integrity and accuracy. I certainly struggle to identify subtle misinformation when faced with it. That’s why I became interested in the psychological triggers woven into the news, to better understand my own decision-making and conclusions. Pennycook and Rand wrote an excellent research paper on the human psychology of fake news.

tl;dr

We synthesize a burgeoning literature investigating why people believe and share false or highly misleading news online. Contrary to a common narrative whereby politics drives susceptibility to fake news, people are ‘better’ at discerning truth from falsehood (despite greater overall belief) when evaluating politically concordant news. Instead, poor truth discernment is associated with lack of careful reasoning and relevant knowledge, and the use of heuristics such as familiarity. Furthermore, there is a substantial disconnect between what people believe and what they share on social media. This dissociation is largely driven by inattention, more so than by purposeful sharing of misinformation. Thus, interventions can successfully nudge social media users to focus more on accuracy. Crowdsourced veracity ratings can also be leveraged to improve social media ranking algorithms.


Make sure to read the full paper titled The Psychology of Fake News by Gordon Pennycook and David G. Rand at https://www.sciencedirect.com/science/article/pii/S1364661321000516

This recent research paper by psychologists at the University of Regina and the MIT Sloan School of Management takes a closer look at the sources of political polarization, hyperpartisan news, and the underlying psychology that influences our decisions about whether news is accurate or misinformation. The authors answer the question of why people fall for misinformation on social media. Lessons drawn from this research will help build effective tools to intercept and mitigate misinformation online, and will further advance our understanding of human psychology in our interactions with information on social media. And while the topic could fill entire libraries, the authors limit their scope to individual examples of misinformation, excluding organized, coordinated campaigns of inauthentic behavior and the deliberate spread of disinformation.

So, Why Do People Fall For Fake News?

There are two fundamental concepts that explain the psychological dynamics at play when people face misinformation. Truth discernment measures the extent to which people believe accurate news more than known-to-be-false information about the same event. This concept is rooted in active recognition and critical analysis of the information, capturing the relative accuracy of people’s beliefs. The second concept is truth acceptance. Here, the accuracy of the news is not a factor, only the overall belief in it. Instead of critically analyzing the information, people average or combine all available information, true or false, to form an opinion about the veracity of the news. This commonly results in a biased perception of news.

Other concepts related to this question look at motives. Political motivations can influence people’s reasoning along the lines of their partisan identity: news consistent with their political beliefs is regarded as true, while news inconsistent with their political beliefs is regarded as false. Loyalty to a political ideology can become so strong that it overrides an apparent falsehood for the sake of party loyalty. Interestingly, the researchers found that political partisanship carries much less weight than the actual veracity of news when people assess information. People rate misinformation that harmonizes with their political beliefs as less trustworthy than accurate information that contradicts them. They also discovered that people tend to be better at analyzing information consistent with their political beliefs, which helps them discern truth from falsehood. But if people rarely fall for misinformation consistent with their political beliefs, which characteristics make people fall for misinformation at all?
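The difference between the two concepts can be made concrete with a small, illustrative calculation (my own sketch, not code from the paper): truth discernment compares how a reader rates true versus false items, while overall belief ignores veracity entirely.

```python
from statistics import mean

# Illustrative only: a reader rates the perceived accuracy of several
# true and several false headlines on a 0-1 scale.

def truth_discernment(true_ratings, false_ratings):
    """How much MORE accurate true news is rated than false news."""
    return mean(true_ratings) - mean(false_ratings)

def overall_belief(true_ratings, false_ratings):
    """Average credulity across all items, ignoring their actual veracity."""
    return mean(list(true_ratings) + list(false_ratings))

# A credulous reader believes nearly everything: high overall belief
# (0.8) but almost no discernment (roughly 0.1).
credulous = (overall_belief([0.9, 0.8], [0.7, 0.8]),
             truth_discernment([0.9, 0.8], [0.7, 0.8]))

# A discerning reader believes less overall (0.45) but separates truth
# from falsehood far better (roughly 0.6).
discerning = (overall_belief([0.8, 0.7], [0.2, 0.1]),
              truth_discernment([0.8, 0.7], [0.2, 0.1]))
```

The second reader shows greater overall skepticism yet far better discernment, which is the distinction the paper draws.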

“People who are more reflective are less likely to believe false news content – and are better at discerning between truth and falsehood – regardless of whether the news is consistent or inconsistent with their partisanship”

Well, this brings us back to truth discernment. Belief in misinformation is commonly associated with overconfidence, lack of reflection, zealotry, delusionality, or overclaiming, where an individual acts on completely fabricated information as a self-proclaimed expert. All of these factors indicate a lack of analytical thinking. On the opposite side of the spectrum, people determine the veracity of information through cognitive reflection and by tapping into relevant existing knowledge. This can be general political knowledge, a basic understanding of established scientific theories, or simple online media literacy.

“Thus, when it comes to the role of reasoning, it seems that people fail to discern truth from falsehood because they do not stop to reflect sufficiently on their prior knowledge (or have insufficient or inaccurate prior knowledge) – and not because their reasoning abilities are hijacked by political motivations.” 

The researchers found that truth has little impact on sharing intentions. They describe three types of information sharing on social media:

  • Confusion-based sharing: a genuine belief in the veracity of the information shared (even though the person is mistaken)
  • Preference-based sharing: political ideology, or related motives such as virtue signaling, is placed above the truth of the information shared, accepting misinformation as collateral damage
  • Inattention-based sharing: people intend to share only accurate information but are distracted by the social media environment

Steps To Own What You Know

If prior knowledge is a critical factor in identifying misinformation, then familiarity with accurate information goes a long way. Awareness of familiar information is critical to determining whether the information presented is the information you already know or a slightly manipulated version of it. Be familiar with social media products: What does virality look like on a given platform? Is the uploader a verified actor? What is the source of the news? In general, sources are a critical signal for determining veracity: the more credible and established a source, the likelier the information is well-researched and accurate. Finally, red flags for misinformation include emotional headlines, provocative captions, and shocking images.
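As a toy illustration of how such signals might be combined, one could score a post by subtracting a penalty per red flag. The flag names and weights below are entirely hypothetical, my own invention rather than anything from the paper, and real platforms use far richer signals and learned models.

```python
# Hypothetical red flags and weights, loosely mirroring the signals above.
RED_FLAGS = {
    "unverified_uploader": 0.2,     # no track record behind the account
    "unknown_source": 0.3,          # no credible, established outlet
    "emotional_headline": 0.2,      # outrage bait, shocking images, provocative captions
    "contradicts_known_info": 0.3,  # differs from what you already know to be true
}

def credibility_score(flags):
    """Start from full credibility (1.0) and subtract a penalty per red flag."""
    penalty = sum(RED_FLAGS[f] for f in flags)
    return max(0.0, 1.0 - penalty)

clean = credibility_score([])                                        # 1.0
dubious = credibility_score(["unknown_source", "emotional_headline"])  # roughly 0.5
```

The point is not the particular numbers but the habit the paper recommends: pausing to check several independent signals before trusting or sharing.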

Challenges To Identify Misinformation

Truth is not a binary metric. A piece of information may be only partially falsified or laced with inaccuracies, and determining its veracity requires comparison against established, known information. The accuracy and precision, or overall quality, of a machine learning classifier for misinformation therefore hinges on the clarity of the provided training data and the depth of its exposure on the platform where it is deployed. Another challenge is the ever-changing landscape of misinformation: it evolves rapidly, congealing into conspiracy theories, and may be (mistakenly) amplified by established influencers and institutions. This makes it harder to discern the elements of a news story, which undermines the chances of determining accuracy. Inoculation (deliberate exposure to misinformation to improve recognition abilities) is partly ineffective because people fail to stop, reflect, and consider the accuracy of the information at all. Successful interventions to minimize misinformation may therefore start with efforts to slow down interactions on social media, for example by changing the user interface to introduce friction and prompts that induce active reflection. Lastly, human fact-checking does not scale, for many reasons: time, accuracy, integrity, and so on. A community-based (crowd-sourced) fact-checking model might be an alternative until a more automated solution is ready. Twitter has recently begun experimenting with this type of crowd-sourced product; its platform is called Birdwatch.
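A minimal sketch of the crowd-sourced idea, under my own simplifying assumptions (this is not Birdwatch’s actual algorithm): average the community’s veracity ratings per post, then weight each post’s engagement by its rating when ranking, so heavily flagged content is demoted rather than removed.

```python
from collections import defaultdict

def aggregate_ratings(ratings):
    """ratings: iterable of (post_id, score in [0, 1]) from community raters.
    Returns the mean veracity score per post."""
    totals = defaultdict(lambda: [0.0, 0])
    for post_id, score in ratings:
        totals[post_id][0] += score
        totals[post_id][1] += 1
    return {pid: s / n for pid, (s, n) in totals.items()}

def rerank(posts, veracity, default_score=0.5):
    """Sort posts by engagement weighted by crowd veracity (unrated posts
    get a neutral default)."""
    return sorted(posts,
                  key=lambda p: p["engagement"] * veracity.get(p["id"], default_score),
                  reverse=True)

posts = [{"id": "viral-rumor", "engagement": 1000},
         {"id": "sober-report", "engagement": 400}]
scores = aggregate_ratings([("viral-rumor", 0.1), ("viral-rumor", 0.2),
                            ("sober-report", 0.9)])
# The heavily flagged rumor (1000 x 0.15 = 150) now ranks below the
# well-rated report (400 x 0.9 = 360).
ranked = rerank(posts, scores)
```

Demoting rather than deleting sidesteps the binary-truth problem above: the crowd signal adjusts attention instead of issuing a verdict.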

This research paper didn’t unearth breakthrough findings or new material. Rather, it helped me learn more about the dynamics of human psychology when exposed to a given set of information. Looking at the individual concepts people use to determine the accuracy of information, the underlying motives that drive our attention, and the dynamics of when we decide to share news made this paper a worthwhile read. Its concluding remarks on improving the technical environment – leveraging technology to facilitate a more reflective, conscious experience of news on social media – leave me optimistic for better products to come.