An Ode To Diplomacy

There are few books that taught me more about the strategic decisions behind U.S. foreign policy than The Back Channel. Bill Burns’ account is both a history lesson and an upbeat reminder of the value of diplomacy.

The Back Channel by Bill Burns is a well-written, historical memoir of one of the finest career diplomats in the foreign service. Exceptionally clear-eyed, balanced, and insightful in both voice and content, Burns walks the reader through his three decades of foreign service. Starting out as the most junior officer at the U.S. embassy in Jordan under then Secretary of State George Shultz, Burns quickly made a name for himself in the Baker State Department through his consistency, his ability to mediate and deliver, but also his language skills in Arabic, French and Russian. In describing events at the State Department, Burns strikes a perfect balance between the intellectual depth of his strategic thinking and the contours of U.S. foreign policy. He offers a rare insight into the mechanics of diplomacy and the pursuit of American interests. For example, Burns illustrates how the focus of the George W. Bush administration in Libya was on changing behavior, not the Qaddafi regime. Sanctions and political isolation had already constricted Qaddafi’s sphere of influence, and American and British delegations, supported by the interdiction of weapons of mass destruction (WMD) shipments in the Mediterranean, were able to convince Qaddafi to give up the terrorism and WMD business. “He needed a way out, and we gave him a tough but defensible one. That’s ultimately what diplomacy is all about – not perfect solutions, but outcomes that cost far less than war and leave everyone better off than they would otherwise have been.”

I was too young to remember German reunification, but I vividly remember the Yeltsin era, its mismanaged economic policy, and the ensuing collapse of the Russian ruble that sent millions of Russians into poverty. When Al-Qaeda attacked the United States on September 11, I was glued to the news coverage for days, then weeks on end – forever changing my worldview and national identity. Burns’ memoir offers a new, liberating viewpoint on these events; it helped me connect their impact on the foreign policy stage to the subsequent decisions of world leaders. This really manifests in his description of Obama’s long game in a post-primacy world:

“Statesmen rarely succeed if they don’t have a sense of strategy – a set of assumptions about the world they seek to navigate, clear purposes and priorities, means matched to ends, and the discipline required to hold all those pieces together and stay focused. They also, however, have to be endlessly adaptable – quick to adjust to the unexpected, massage the anxieties of allies and partners, maneuver past adversaries, and manage change rather than be paralyzed by it. (…) Playing the long game is essential, but it’s the short game – coping with stuff that happens unexpectedly – that preoccupies policymakers and often shapes their legacies.”

But aside from candid leadership lessons and rich historical insight, what makes The Back Channel so captivating is its upbeat and fervent case for diplomacy. Burns goes out of his way to detail the daily grind required to serve and succeed in the State Department:

“As undersecretary, and then later as deputy secretary, I probably spent more time with my colleagues in the claustrophobic, windowless confines of the White House Situation Room than I did with anyone else, including my own family. (…) Our job was to propose, test, argue, and, when possible, settle policy debates and options, or tee them up for the decision of cabinet officials and the president. None of the president’s deputy national security advisors, however, lost sight of the human element of the process. (…) We were, after all, a collection of human beings, not an abstraction – always operating with incomplete information, despite the unceasing waves of open-source and classified intelligence washing over us; often trying to choose between bad and worse options.”

Moreover, Burns offers lessons for aspiring career diplomats:

“Effective diplomats (also) embody many qualities, but at their heart is a crucial trinity: judgment, balance, and discipline. All three demand a nuanced grasp of history and culture, mastery of foreign languages, hard-nosed facility in negotiations, and the capacity to translate American interests in ways that other governments can see as consistent with their own – or at least in ways that drive home the cost of alternative courses. (…) What cannot be overstated, however, is the importance of sound judgment in a world of fallible and flawed humans – weighing ends and means, anticipating the unintended consequences of well-intentioned actions, and measuring the hard reality of limits against the potential of American agency.”

Taken together, all of this makes The Back Channel a must-read of the highest quality for anyone interested in U.S. foreign policy or diplomacy. I would even argue the shrewd political observations captured in The Back Channel make for a valuable read with regard to domestic policy or current affairs, though a modicum of international policy awareness is still required. The Back Channel’s only drawback is its predominant focus on American interests in the Middle East and Europe. I can’t help but wonder what the United States would look like today had its political leadership opted for a strategy of offshore balancing instead of a grand strategy of primacy; more focused on pressing domestic issues such as trade or immigration with our immediate neighbors Canada, Mexico and northern Latin America. I’m curious to hear Burns’ thoughts on this. Perhaps he’ll cover this arena after finishing his term as director of the Central Intelligence Agency.

Demystifying Foreign Election Interference

The Office of the Director of National Intelligence (ODNI) released a declassified report detailing efforts by foreign actors to influence and interfere in the 2020 U.S. presidential elections. The key finding of the report: Russia sought to undermine confidence in our democratic processes to support then President Donald J. Trump. Iran launched similar efforts but to diminish Trump’s chances of getting reelected. And China stayed out of it altogether.  

(Source: ODNI)

Make sure to read the full declassified report titled Intelligence Community Assessment of Foreign Threats to the 2020 U.S. Federal Elections released by the Office of the Director of National Intelligence at https://www.odni.gov/index.php/newsroom/reports-publications/reports-publications-2021/item/2192-intelligence-community-assessment-on-foreign-threats-to-the-2020-u-s-federal-elections

Background

On September 12, 2018 then President Donald J. Trump issued Executive Order 13848 to address foreign interference in U.S. elections. In essence, it authorizes an interagency review to determine whether interference has occurred. In the event of foreign interference in a U.S. election, the directive orders the creation of an impact report that can trigger sanctions against (1) foreign individuals and (2) nation states. A comprehensive breakdown of the directive, including the process of imposing sanctions, can be found here. I will focus only on the findings of the interagency review laid out in the Intelligence Community Assessment (ICA) pursuant to EO 13848 (1)(a). The ICA is limited to intelligence reporting and other information available as of December 31, 2020.

Findings

Before his own election in 2016, during his presidency and beyond the 2020 presidential elections, the former President plied American voters with unsubstantiated claims of foreign election interference that would disadvantage his reelection chances. In Trump’s mind, China sought to undermine his chances of being reelected to office while he downplayed the role of Russia or Iran. The recently released ICA directly contradicts Trump’s claims. Here’s the summary per country:

Russia

  • Russia conducted influence operations, authorized by Vladimir Putin, targeting the integrity of the 2020 presidential elections
  • Russia supported then incumbent Donald J. Trump and aimed to undermine confidence in then candidate Joseph R. Biden
  • Russia attempted to exploit socio-political divisions through spreading polarized narratives without leveraging persistent cyber efforts against critical election infrastructure

The ICA finds a recurring theme of Russian intelligence officials pushing misinformation about President Biden through U.S. media organizations, officials and prominent individuals. Such influence operations follow basic money laundering structures: (1) create and disseminate a false or misleading narrative, (2) conceal its source by layering it across multiple media outlets involving independent (unaware) actors, and (3) integrate the damning narrative into the nation state’s official communication after the fact. A prominent example was the false claim of corrupt ties between President Biden and Ukraine, which began spreading as early as 2014.

Russian attempts to sow discord among the American people took place through narratives that amplified misinformation about the election process and its systems, e.g. undermining the integrity of mail-in ballots or highlighting technical failures and instances of misconduct. In a broader sense, topics around pandemic-related lockdown measures, racial injustice or conservative censorship were exploited to polarize the affected groups. While such efforts could have called on Russia’s offensive cyber units, the evidence for a persistent cyber influence operation was not conclusive. The ICA categorized Russian cyber actions as general intelligence gathering to inform Russian foreign policy rather than as specifically targeting critical election infrastructure.

Iran

  • Iran conducted influence operations targeting the integrity of the 2020 presidential elections, likely authorized by Supreme Leader Ali Khamenei
  • Unlike Russia, Iran did not support either candidate but aimed to undermine confidence in then incumbent Donald J. Trump
  • Iran did not interfere in the 2020 presidential elections, with interference defined as activities targeting technical aspects of the election

The ICA finds Iran leveraged influence tactics similar to Russia’s, targeting the integrity of the election process, presumably in an effort to steer the public’s attention away from Iran and towards domestic issues around pandemic-related lockdown measures, racial injustice or conservative censorship. However, Iran relied more notably on cyber-enabled offensive operations. These included aggressive spoofed emails disguised as coming from the Proud Boys group to intimidate liberal and left-leaning voters. Spear phishing emails sent to former and current officials aimed to gain sensitive information and access to critical infrastructure. A high volume of inauthentic social media accounts, some dating back to 2012, was used to create divisive political narratives.

China

  • China did not conduct influence operations or efforts to interfere in the 2020 presidential elections

The ICA finds China did not actively interfere in the 2020 presidential elections. While the rationale in this assessment is largely based on political reasoning and foreign policy objectives, the report provides no data points for me to evaluate. It also offers no insights into the role of the Chinese technology platforms repeatedly targeted by the former President. A minority view held by the National Intelligence Officer for Cyber (NIO) is that China did deploy some offensive cyber operations to counter anti-Chinese policies. Former Director of National Intelligence John Ratcliffe champions this minority view in a scathing memorandum that concludes the ICA fell short in its analysis with regard to China.

Recommendations

The ICA offers several insights into a long, strenuous election cycle. Its sober findings help to reformulate U.S. foreign policy and redefine domestic policy objectives. While the report is unable to detail all available intelligence and other information, it offers some guidance for shaping future policies. For example:

  1. Cybersecurity – increased efforts to update critical election infrastructure have probably played a key role in the decreased level of offensive cyber operations. Government and private actors must continue to focus on cybersecurity, practice cyber hygiene and conduct digital audits to improve cyber education
  2. Media Literacy – increased efforts to educate the public about political processes. This includes private actors educating their users about potential abuse on their platforms. Continuing programs to depolarize ideologically charged groups through empathy and regulation is a cornerstone of a more perfect union

Additional and more detailed recommendations to improve the resilience of American elections and democratic processes can be found in the Joint Report of the Department of Justice and the Department of Homeland Security on Foreign Interference Targeting Election Infrastructure or Political Organization, Campaign, or Candidate Infrastructure Related to the 2020 U.S. Federal Elections.

Threat Assessment: Chinese Technology Platforms

The American University Washington College of Law and the Hoover Institution at Stanford University created a working group to understand and assess the risks posed by Chinese technology companies in the United States. They propose a framework to better assess and evaluate these risks by focusing on the interconnectivity of threats posed by China to the US economy, national security and civil liberties.

tl;dr

The Trump administration took various steps to effectively ban TikTok, WeChat, and other Chinese-owned apps from operating in the United States, at least in their current forms. The primary justification for doing so was national security. Yet the presence of these apps and related internet platforms presents a range of risks not traditionally associated with national security, including data privacy, freedom of speech, and economic competitiveness, and potential responses raise multiple considerations. This report offers a framework for both assessing and responding to the challenges of Chinese-owned platforms operating in the United States.

Make sure to read the full report titled Chinese Technology Platforms Operating In The United States by Gary P. Corn, Jennifer Daskal, Jack Goldsmith, John C. Inglis, Paul Rosenzweig, Samm Sacks, Bruce Schneier, Alex Stamos, Vincent Stewart at https://www.hoover.org/research/chinese-technology-platforms-operating-united-states 

(Source: New America)

China has experienced consistent growth since opening its economy in the late 1970s. With its economy roughly fourteen times larger today, this growth trajectory dwarfs that of the US economy, which roughly doubled over the same period, with the S&P 500 as its most rewarding driver at about a fivefold increase. Alongside economic power comes a thirst for global expansion far beyond the Asia-Pacific region. China’s foreign policy seeks to advance the Chinese one-party model of authoritarian capitalism, which could pose a threat to human rights, democracy and the basic rule of law. US political leaders see these developments as a threat to their own foreign policy of primacy but, perhaps more importantly, as a threat to the western ideology deeply rooted in individual liberties. Needless to say, over the years every administration, regardless of political affiliation, has put the screws on China. A recent example is the presidential executive order addressing the threat posed by the social media video app TikTok. Given China’s authoritarian model of governance and the government’s sphere of control over Chinese companies, their expansion into the US market raises concerns about access to critical data and data protection, or cyber-enabled attacks on critical US infrastructure, among a wide range of other threats to national security. For example:

Internet Governance: China is pursuing regulation to shift the internet from open to closed and from decentralized to centralized control. The US government has failed to adequately engage international stakeholders to maintain an open internet and has instead authorized large data collection programs that emulate Chinese surveillance.

Privacy, Cybersecurity and National Security: The internet’s continued democratization encourages more social media and e-commerce platforms to integrate and connect features, enabling multi-surface products for users. Mass data collection, weak product cybersecurity and the absence of broader data protection regulations can be exploited to collect data on domestic users, their behavior and their travel patterns abroad. They can also be exploited to influence or control members of government agencies through targeted intelligence or espionage. Here the key consideration is aggregated data, which even in the absence of identifiable actors can be used to create viable intelligence. China has ramped up its offensive cyber operations beyond cyber-enabled trade secret and IP theft and possesses the capabilities and cyber-weaponry to destabilize national security in the United States.

Necessity And Proportionality 

In considering actions against Chinese-owned or -controlled communications technology, including tech products manufactured in China, to mitigate the threat to national security, the working group suggests a case-by-case analysis. They attempt to address the challenge of accurately identifying specific risks in an ever-changing digital environment with a framework of necessity and proportionality. Technology standards change at a breathtaking pace. Data processing reaches new levels of intimacy due to the use of artificial intelligence and machine learning. Thoroughly assessing, vetting and weighing the tolerance for specific risks is at the core of this framework in order to calibrate a chosen response and avoid potential collateral consequences.

The working group’s framework of necessity and proportionality reminded me of a classic lean six sigma structure with a strong focus on understanding the threat to national security. Naturally, as a first step they suggest accurately identifying the threat’s nature, credibility, imminence and the chances of the threat becoming a reality. I found this first step incredibly important because a failure to identify a threat will likely lead to false attribution and undermine every subsequent step. In the context of technology companies the obvious challenge is data collection, data integrity and detection systems capable of telling the difference. By that I mean a Chinese actor may deploy a cyber ruse in concert with the Chinese government to obfuscate their intentions.

Following the principle of proportionality, step two looks into the potential collateral consequences to the United States, its strategic partners and, most importantly, its citizens. Policymakers must be aware of the unintended path a policy decision may take once a powerful adversary like China starts its propaganda machine. Therefore this step requires policymakers to define thresholds for when the collateral consequences of a mitigating measure outweigh the need to act. In particular, inalienable rights such as the freedom of expression, freedom of the press or freedom of assembly must be upheld at all times as they are fundamental American values. To quote the immortal Molly Ivins: “Many a time freedom has been rolled back – and always for the same sorry reason: fear.”

The third and final step concerns mitigation measures. In other words: what are we going to do about it? The working group landed on two critical factors: data and compliance. The former might be restricted, redirected or recoded to adhere to national security standards. The latter might be audited not only to identify vulnerabilities but also to instill built-in cybersecurity and foster an amicable working relationship.

The Biden administration faces the daunting challenge of reviewing and developing appropriate cyber policies that address the growing threat from Chinese technology companies in a coherent manner consistent with American values. Only a broad policy response that is tailored to specific threats and focused on stronger cybersecurity and stronger data protection will yield equitable results. International alliances, alongside increased collaboration to develop better privacy and cybersecurity measures, will lead to success. However, the US must focus on its own strengths first and leverage its massive private sector to identify specific product capabilities, and therefore threats and attack vectors, before taking short-sighted, irreversible actions.

On Propaganda: Russia vs United States

The political and diplomatic relations between the United States and Russia have been in decline for the past decade. Geopolitical tensions between the two nations have increased steadily, leading to more and more political propaganda by their respective state media. This is also reflected in their government policy documents. These propaganda efforts resulted in a number of influence operations, ranging from coordinated inauthentic behavior to create a false narrative to the intentional spread of disinformation to undermine the political integrity of the other side. A recent article by researchers at the University of Sheffield and Bard College examined 135 journalistic pieces of American and Russian state media to better understand how propaganda is portrayed in both countries. It’s an important contribution to better understanding emerging public crises, appropriate content policy responses and future diplomacy.

tl;dr

The period of growing tensions between the United States and Russia (2013–2019) saw mutual accusations of digital interference, disinformation, fake news, and propaganda, particularly following the Ukraine crisis and the 2016 US presidential election. This article asks how the United States and Russia represent each other’s and their own propaganda, its threat, and power over audiences. We examine these representations in US and Russian policy documents and online articles from public diplomacy media Radio Free Europe/Radio Liberty (RFE/RL) and RT. The way propaganda is framed, (de)legitimized, and securitized has important implications for public understanding of crises, policy responses, and future diplomacy. We demonstrate how propaganda threats have become a major part of the discourse about the US–Russia relationship in recent years, prioritizing state-centred responses and disempowering audiences.

Make sure to read the full article titled Competing propagandas: How the United States and Russia represent mutual propaganda activities by Dmitry Chernobrov and Emma L. Briant at https://journals.sagepub.com/doi/full/10.1177/0263395720966171

Credit: https://econ.st/39hEh05 & https://bit.ly/2XuWm5i

How does the United States influence its own citizens by the way it represents the propaganda efforts of Russia at home? How is American propaganda portrayed in Russia? Contrary to popular belief, the United States actively conducts influence operations to disseminate propaganda in foreign countries and at home. Under the U.S. Information and Educational Exchange Act of 1948, known as the Smith-Mundt Act, and its 2012 modernization amendments, the U.S. government is free to extend propaganda efforts to public broadcasters and radio stations both foreign and domestic. In Russia, the situation is quite different: state-owned media and the strategic use of broadcasting and information technologies are a central feature of the current government. Recent legislation aimed at pressuring the opposition and restricting freedom of speech and assembly is only a surface example of Russia’s soft power approach in foreign and domestic policy. President Putin defined soft power as “promoting one’s interests and policies through persuasion”, which has translated into Russian public diplomacy initiatives that use a combination of international broadcasting and web-based social networks to engage foreign publics.

Russia’s constitution declares Russia to be a democratic state with a republican form of government and state power divided between the legislative, executive and judicial branches. However, the policy changes introduced by Vladimir Putin have effectively turned Russia’s political system into a particular type of post-totalitarian authoritarianism. The United States political system, by contrast, is a federal constitutional republic with checks and balances between the executive, the legislative and the judicial branch. While the United States and Russia differ greatly with regard to their political systems, their respective mechanisms for constructing and responding to propaganda afford valuable insights for communication researchers.

To start, it is important to understand propaganda and its variation in public diplomacy. The researchers elaborate on the intricacies of finding the appropriate terminology but suggest propaganda to be 

“a process by which an idea or an opinion is communicated to someone else for a specific persuasive purpose”

Public diplomacy goes a step further by extending a state’s policy objectives to an international audience through intercultural exchanges, advocacy and international broadcasting. The researchers examined 90 articles by Radio Free Europe/Radio Liberty and 45 articles by Russia Today through a critical discourse analysis in an effort to compare and differentiate between moral representation, reporting on government actions, actors and victims. They then cross-referenced these with threats to national security and national identity. They found that both the United States and Russia view and portray propaganda as an external threat orchestrated by a foreign actor, using conflict-related, binary language with no room for compromise. Both countries’ propaganda language made wide use of scientific and technological metaphors to create an impression of sophistication beyond the comprehension of the average Joe. Mere exposure to this type of propaganda is assumed to be enough to rally citizens; actual persuasion apparently was not an objective for either U.S. or Russian state media.

While the United States understands propaganda as a foreign threat against the national security of the West, Russian documents use “foreign” to signal the externality of influence communications or misinformation, without specific location labeling in the context of propaganda. The United States often portrays itself as ‘leader of the free world’ with the oldest free democracy, whereas Russia is depicted as a morally isolated, neo-soviet autocracy. Russia, diametrically, portrays the United States through its flawed political system, redirecting the Russian audience to the shortcomings of American democracy and its conflicting political leaders. Both the United States and Russia construct propaganda in similar ways using similar elements: (1) they present it as a national security threat to the state and its international reputation, and (2) they reframe domestic political problems as foreign-induced, which then justifies a strong, determined state response. This fear-driven approach of portraying propaganda as something that can only be mitigated by a strong government response tends to disenfranchise citizens, induce chilling effects that lead to censorship, and undermine civic engagement.

As both the United States and Russia will likely continue to lace state media with their propaganda, citizens can learn to be vigilant when interacting with content online, in particular if the overall messaging presents itself in a binary fashion. To counter disinformation, policymakers must better communicate policy solutions and focus on media literacy and education. Lastly, government officials can help decrease propaganda and polarization by reframing their political narratives through an infinite mindset with choices and compassion.

Is Transparency Really Reducing The Impact Of Misinformation?

A recent study investigated YouTube’s efforts to provide more transparency about the ownership of certain YouTube channels. The study concerned YouTube’s disclaimers displayed under videos to indicate that the content was produced or funded by a state-controlled media outlet. It sought to shed light on whether or not these disclaimers are an effective means of reducing the impact of misinformation.

tl;dr

In order to test the efficacy of YouTube’s disclaimers, we ran two experiments presenting participants with one of four videos: A non-political control, an RT video without a disclaimer, an RT video with the real disclaimer, or the RT video with a custom implementation of the disclaimer superimposed onto the video frame. The first study, conducted in April 2020 (n = 580) used an RT video containing misinformation about Russian interference in the 2016 election. The second conducted in July 2020 (n = 1,275) used an RT video containing misinformation about Russian interference in the 2020 election. Our results show that misinformation in RT videos has some ability to influence the opinions and perceptions of viewers. Further, we find YouTube’s funding labels have the ability to mitigate the effects of misinformation, but only when they are noticed, and the information absorbed by the participants. The findings suggest that platforms should focus on providing increased transparency to users where misinformation is being spread. If users are informed, they can overcome the potential effects of misinformation. At the same time, our findings suggest platforms need to be intentional in how warning labels are implemented to avoid subtlety that may cause users to miss them.

Make sure to read the full article titled State media warning labels can counteract the effects of foreign misinformation by Jack Nassetta and Kimberly Gross at https://misinforeview.hks.harvard.edu/article/state-media-warning-labels-can-counteract-the-effects-of-foreign-misinformation/

Source: RT, 2020 US elections, Russia to blame for everything… again, Last accessed on Dec 31, 2020 at https://youtu.be/2qWANJ40V34?t=164

State-controlled media outlets are increasingly used for foreign interference in civic events. While independent media outlets can be categorized on social media and associated with a political ideology, a state-controlled media outlet generally appears independent or detached from a state-controlled political agenda. Yet such outlets regularly create content concomitant with the controlling state’s political objectives and its leaders. This deceives the public about their state affiliation and undermines civil liberties. The problem is magnified on social media platforms, with their reach and potential for virality, ahead of political elections. A prominent example is China’s foreign interference efforts around the referendum on the independence of Hong Kong.

An increasing number of social media platforms have launched integrity measures to increase content transparency and counter the risks associated with state-controlled media outlets proliferating potential disinformation content. In 2018 YouTube began to roll out an information panel feature to provide additional context on state-controlled and publicly funded media outlets. These information panels or disclaimers are really warning labels that make the viewer aware of the potential political influence of a government on the information shown in the video. The warning labels don’t provide any additional context on the veracity of the content or whether the content was fact-checked. On desktop, they appear alongside a hyperlink leading to the Wikipedia entry of the media outlet. As of this writing the feature applies to 27 governments including the United States government.

Source: DW News, Massive explosion in Beirut, Last Accessed on Dec 31, 2020 at https://youtu.be/PLOwKTY81y4?t=7

The researchers focused on whether these warning labels would mitigate the effects of misinformation shown in videos of the Russian state-controlled media outlet RT (Russia Today) on viewers’ perceptions. RT evades deplatforming by complying with YouTube’s terms of service. This turned the RT channel into an influential resource for the Russian government to undermine the American public’s trust in established American media outlets and the United States government when reporting on the Russian interference in the 2016 U.S. presidential elections. An RT video downplaying the Russian influence operation was used for the study and shown to participants without a label, with the real label identifying RT’s affiliation with the Russian government, and with a superimposed warning label carrying the same language and hyperlink to Wikipedia. This surfaced the following findings:

  1. Disinformation spread by RT does impact viewers’ perceptions, and effectively so
  2. Videos without a warning label were more successful in reducing trust in established mainstream media and the government
  3. Videos without the standard warning label but with a superimposed interstitial carrying the same language were most effective in preserving the integrity of viewers’ perceptions

Source: RT, $4,700 worth of ‘meddling’: Google questioned over ‘Russian interference’, Last accessed on Dec 31, 2020 at https://www.youtube.com/watch?v=wTCSbw3W4EI

The researchers further discovered that small changes in the coloring, design and placement of the warning label increase the chance that viewers notice it and absorb its information. Both conditions must be met, because noticing a label without comprehending its message had no significant impact on understanding the political connection between creator and content.

I’m intrigued by these findings, for the road ahead offers a great opportunity to shape how we distribute and consume information on social media without falling prey to foreign influence operations. Though open questions remain:

  1. Are these warning labels equally effective on other social media platforms, e.g. Facebook, Instagram, Twitter, Reddit, TikTok, etc.? 
  2. Are these warning labels equally effective with other state-controlled media? This study focused on Russia, a large, globally acknowledged state actor. How does a warning label for content by the government of Venezuela or Australia impact the efficacy of misinformation? 
  3. This study seemed to be focused on the desktop version of YouTube. Are these findings transferable to the mobile version of YouTube?  
  4. What is the impact of peripheral content on viewer’s perception, e.g. YouTube’s recommendation showing videos in its sidebar that all claim RT is a hoax versus videos that all give RT independent credibility?
  5. The YouTube channels of C-SPAN and NPR did not appear to display a warning label within their videos. Yet the United States is among the 27 countries currently listed in YouTube’s policy. What are the criteria to be considered a publisher, publicly funded or state-controlled? How are these criteria met or impacted by a government, e.g. passing certain broadcasting legislation or declaration?
  6. Lastly, the cultural and intellectual background of the target audience is particularly interesting. Here is an opportunity to research the impact of warning labels on participants of different political ideologies, economic circumstances and age groups in contrast to their actual civic engagement ahead of, during and after an election.