Political Warfare Is A Threat To Democracy. And Free Speech Enables It

“I disapprove of what you say, but I will defend to the death your right to say it” is an interpretation of Voltaire’s principles by Evelyn Beatrice Hall. Freedom of expression is often cited as the last frontier before falling into authoritarian rule. But is free speech, our greatest strength, really our greatest weakness? Hostile authoritarian actors seem to exploit these individual liberties by engaging in layered political warfare to undermine trust in our democratic systems. These often clandestine operations pose an existential threat to our democracy.   

tl;dr

The digital age has permanently changed the way states conduct political warfare—necessitating a rebalancing of security priorities in democracies. The utilisation of cyberspace by state and non-state actors to subvert democratic elections, encourage the proliferation of violence and challenge the sovereignty and values of democratic states is having a highly destabilising effect. Successful political warfare campaigns also cause voters to question the results of democratic elections and whether special interests or foreign powers have been the decisive factor in a given outcome. This is highly damaging for the political legitimacy of democracies, which depend upon voters being able to trust in electoral processes and outcomes free from malign influence—perceived or otherwise. The values of individual freedom and political expression practised within democratic states challenge their ability to respond to political warfare. The continued failure of governments to understand this has undermined their ability to combat this emerging threat. The challenges that this new digitally enabled political warfare poses to democracies are set to rise with developments in machine learning and the emergence of digital tools such as ‘deep fakes’.

Make sure to read the full paper titled Political warfare in the digital age: cyber subversion, information operations and ‘deep fakes’ by Thomas Paterson and Lauren Hanley at https://www.tandfonline.com/doi/abs/10.1080/10357718.2020.1734772

MC2 Joseph Millar | Credit: U.S. Navy

This paper’s central theme sits at the intersection of democratic integrity and political subversion operations. The authors describe an increase in cyber-enabled espionage and political warfare due to the global spread of the internet. They argue it has led to an imbalance between authoritarian and democratic state actors. Their argument rests on the notion that individual liberties such as freedom of expression put democratic states at a disadvantage compared to authoritarian states. Authoritarian states are therefore observed to more often choose political warfare and subversion operations, whereas democracies are confined to breaching cyber security and conducting cyber espionage. Cyber espionage is defined as

“the use of computer networks to gain illicit access to confidential information, typically that held by a government or other organization”

and is not a new concept. I disagree with the premise of illicit access, because cyberspace specifically enables the free flow of information beyond any local regulation. ‘Illicit’ is either redundant, since espionage does not necessarily require breaking laws, rules or customs, or it duplicates ‘confidential information’, which I interpret as synonymous with classified information, though one might argue about the difference. From a legal perspective, the information does not need to be obtained through illicit access.

With regard to the broader term, I found the following definition of political warfare most appropriate:

“diverse operations to influence, persuade, and coerce nation states, organizations, and individuals to operate in accord with one’s strategic interests without employing kinetic force” 

It demonstrates the depth of political warfare, which encompasses influence and subversion operations beyond physical activity. Subversion operations are defined as

“a subcategory of political warfare that aims to undermine institutional as well as individual legitimacy and authority”

I disagree with this definition because it fails to emphasize the difference between political warfare and subversion: both undermine legitimacy and authority. A subversion operation, however, specifically aims to erode and deconstruct a political mandate. It is the logical next step after political warfare has influenced a populace in order to achieve political power. The authors see the act of subversion culminating in a loss of trust in democratic principles. It leads to voter suppression, reduced voter participation and decreased, asymmetrical review of electoral laws, but more importantly it challenges the democratic values of a state’s citizens. It is an existential threat to a democracy. It favors authoritarian states detached from the checks and balances that are usually present in democratic systems. These actors are not limited by law, civic popularity or reputational capital. Ironically, this bestows a certain amount of freedom upon them to deploy political warfare operations. Democracies, on the other hand, uphold individual liberties such as freedom of expression, freedom of the press, freedom of assembly, equal treatment under law and due process. As demonstrated during the 2016 U.S. presidential election, a democracy generally struggles to distinguish political warfare initiated by a hostile foreign state from segments of its own population pursuing their strategic objectives by leveraging these exact individual freedoms. The Mueller Report, for example,

“stated that the Internet Research Agency (IRA), which had clear links to the Russian Government, used social media accounts and interest groups to sow discord in the US political system through what it termed ‘information warfare’ […] The IRA’s operation included the purchase of political advertisements on social media in the names of US persons and entities, as well as the staging of political rallies inside the United States.”

And it doesn’t stop in America. Russia is deploying influence operations in volatile regions on the African continent. China has a history of attempting to undermine democratic efforts in Africa. Both states aim to chip away at the power of former colonial powers such as France, or at least to suppress efforts to democratise regions in Africa. China is also deeply engaged in large-scale political warfare in Southeast Asia over regional dominance as well as territorial expansion, as observed in the South China Sea. New Zealand and Australia have recorded numerous incidents of attempted Chinese influence operations. Australia faced a real-world political crisis when Australian Labor Senator Sam Dastyari was found to be connected to political donor Huang Xiangmo, who has ties to the Chinese Communist Party, giving China a direct route to influence Australian policy decisions.

The paper concludes with an overview of future challenges posed by political warfare. With more and more computing power readily available, the development of new cyber tools and tactics for political warfare operations is only going to increase. Authoritarian states are likely to expand their disinformation playbooks by tapping into people’s fears fueled by conspiracy theories. Developments in machine learning and artificial intelligence will further improve inauthentic behavior online. For example, partisan political bots will become more human-like and harder to discern from real human users. Deep fake technology will improve by tapping into ever larger datasets drawn from the social graph of every human being, making it increasingly possible to impersonate individuals to gain access or achieve certain strategic objectives. Altogether, political warfare poses a greater challenge than cyber-enabled espionage, in particular for democracies. Democracies need to understand this asymmetrical relationship with authoritarian actors and dedicate resources to effective countermeasures against political warfare without undoing civil liberties in the process.


Essential Politics: Watch This Ahead Of The 2020 Election

I put together a list of my favorite political documentaries. The focus is on the upcoming election. It’s not a complete list, and it never will be. It takes the pulse of a polarized campaign year as it draws to a close. Maybe it will help you unwind before you cast your vote (if you haven’t already)!

tl;dr

Make sure to vote on November 3rd. Also make sure to watch The Choice 2020 and Dark Money. Both documentaries offer clear, unfiltered insights into the minds of Joe Biden and Donald Trump, and both are designed to make you critically reflect on the impact your vote has on the health of our democracy. Lastly, I recommend taking a look at The Circus too. It’s an entertaining yet jaw-dropping show about the inner workings of campaigns. It’s a reminder that as long as not everybody can afford to run for office, our democracy needs to improve.

Get Me Roger Stone (2017)

This 1 ½ hour long documentary film details the rise and fall of Republican political operative, former Nixon aide, long-time friend of Donald Trump and convicted felon Roger Stone. The life of Roger Stone is marked by political victories without Stone ever becoming successful in politics himself. A long-time fixture in Republican politics, Roger Stone was never accepted by the Republican establishment. Stone is the centerpiece of the film, which shows his eccentric character in interviews about his journey from Richard Nixon to Ronald Reagan to Donald Trump. While the stories told are intriguing, the film allocates too much credit to a marginalized political strategist who merely deployed shock-and-awe techniques to stun his opposition without much long-term strategy at play.

Watch on Netflix: https://www.netflix.com/title/80114666

Dark Money (2018)

The Supreme Court decision in Citizens United v. FEC was a landmark ruling that paved the way for political interest groups to influence U.S. elections. Dark Money is a 1 ½ hour long documentary film that follows John S. Adams of the Montana Free Press on his journey to investigate how corporate money is undermining American politics. It is a harrowing contemporary film that should rattle all Americans, for it details the existential threat to local journalism as the last check able to properly balance corporate exploitation. This shocking story of intentional deceit of American voters contributes to a larger debate the U.S. electorate must face about how their democracy erodes when elections are bought and sold.

More info: https://www.darkmoneyfilm.com

Knock Down The House (2019)

This 1 ½ hour long documentary film follows four female Democratic candidates in their 2018 midterm election campaigns for the U.S. House of Representatives. The candidates are Alexandria Ocasio-Cortez (New York), Amy Vilela (Nevada), Cori Bush (Missouri) and Paula Jean Swearengin (West Virginia), each facing a long-term Democratic incumbent who hasn’t been challenged in decades. The film is a gripping story about the absurdity of election campaigning in American politics. But more importantly, it tells the stories of four women who not only face long-term incumbents with years of political experience and large donor bases, but also the challenge of overcoming the stigma that a woman cannot be a leader in an executive role. With the documentary following four candidates, I found too much screen time focused on Alexandria Ocasio-Cortez, who has a polarizing nature but really is only one fourth of this larger, systemic issue. Nevertheless, this documentary is a must-see to understand the critical failures in diversity and equality that America has turned a blind eye to.

More info: https://knockdownthehouse.com

Hillary (2020)

Across four episodes, this 4 ½ hour long documentary series about Hillary Clinton’s unsuccessful 2016 presidential campaign chronicles the roller-coaster emotions of running a national campaign. It reveals an unseen personal side of Hillary Clinton by looking at milestones in her career and personal life that shaped her thinking as a woman, mother and politician. Archival footage is seamlessly interlaced with contemporary interviews. This documentary strikes a perfect balance between Hillary Clinton’s extraordinary intellectual breadth and her impact on foreign and domestic political affairs while remaining captivating and worth your while.

Watch on hulu: https://www.hulu.com/series/hillary-793891ec-5bb7-4200-ba93-e3629532d670

The Circus: Inside The Craziest Campaign On Earth, Season 5 (2020)

I am not sure if The Circus needs any introduction. Running for five seasons with 84 individual episodes to date, each 30 minutes or longer, this epic TV documentary series is a real-time political insider show you simply cannot miss out on. Season 5 begins in the aftermath of the impeachment trial against Donald Trump, quickly transitioning to the first primary elections for the 2020 U.S. presidential election in Iowa, New Hampshire and Nevada. Then COVID-19 hit the country. The restrictions and nationwide lockdown measures changed this election like no other in history. After episode 8 in March 2020, the show paused due to the lockdown measures, only to resume as “The New Abnormal” with episode 9 in August 2020. The long break changed everything, yet it didn’t change the tension and excitement of a run for the highest office in the land of the free. The hosts of The Circus, John Heilemann, Mark McKinnon and Alex Wagner, do a stellar job of observing the candidates’ campaigns and asking the hard questions while reflecting on the bizarre experience of running for president during a pandemic.

Watch on Showtime: https://www.sho.com/the-circus-inside-the-greatest-political-show-on-earth/season/5 

The Swamp (2020)

This 1 ½ hour long documentary film follows three Republican Congressmen, Matt Gaetz (Florida), Thomas Massie (Kentucky) and Ken Buck (Colorado), in their efforts to leave a mark on the American political landscape. It details the constant need for fundraising and donations, which leaves lawmakers at the mercy of interest groups and lobbyists. It is another concerning picture of American politics chained to corporate interests. The documentary is hosted by Harvard Law professor Lawrence Lessig, who founded “Rootstrikers” to reduce the corrosive influence of corporate money in American democracy.

Watch on HBO: https://www.hbo.com/documentaries/the-swamp 

The Choice 2020: Trump vs. Biden (2020)

The name says it all for this 2 hour long documentary film by Frontline. Ahead of the most polarized election in U.S. history, the film focuses on interviews with family, friends and foes of both candidates, Joe Biden and Donald Trump. It chronicles how Biden and Trump handled themselves during times of crisis and how they responded to adversity. It’s about who these men asking for your vote really are.

Watch on PBS: https://www.pbs.org/wgbh/frontline/film/the-choice-2020-trump-vs-biden/

+++++++++++++++++++++++++++++++

Make sure to vote on November 3rd


Countdown to the 2020 U.S. Presidential Election: November 3, 2020.

More info here: https://www.usa.gov/voting

How Political Bots Worsen Polarization

Do you always know who you are dealing with? Probably not. Do you always recognize when you are being influenced? Unlikely. I find it hard to pick up on human signals without succumbing to my own predisposed biases. In other words, maintaining “an open mindset” is easier said than done. A recent study found this to be especially true when dealing with political bots.

tl;dr

Political bots are social media algorithms that impersonate political actors and interact with other users, aiming to influence public opinion. This research investigates the ability to differentiate bots with partisan personas from humans on Twitter. This online experiment (N = 656) explores how various characteristics of the participants and of the stimulus profiles bias recognition accuracy. The analysis reveals asymmetrical partisan-motivated reasoning, in that conservative profiles appear to be more confusing and Republican participants perform less well in the recognition task. Moreover, Republican users are more likely to confuse conservative bots with humans, whereas Democratic users are more likely to confuse conservative human users with bots. The research discusses implications for how partisan identities affect motivated reasoning and how political bots exacerbate political polarization.

Make sure to read the full paper titled Asymmetrical Perceptions of Partisan Political Bots by Harry Yaojun Yan, Kai-Cheng Yang, Filippo Menczer, James Shanahan at https://journals.sagepub.com/doi/full/10.1177/1461444820942744 

Illustration by C. R. Sasikumar

The modern democratic process is technological information warfare. Voters need to be enticed to engage with information about a candidate, and election campaigns need to ensure accurate information is presented to build and expand an audience, or voter base. Assurances for the integrity of information do not exist, and campaigns are incentivised to undercut the opponent’s narrative while amplifying their own candidate’s message. Advertisements are a potent weapon in any election campaign. Ad spending on social media for the 2020 U.S. presidential election between Donald Trump and Joe Biden is already at a high, with a projected total bill of $10.8 billion driven by both campaigns. Grassroots campaigns are another potent weapon to decentralize a campaign, mobilize local leaders and reach a particular (untapped) electorate. While the impact of the coronavirus on grassroots efficacy is yet to be determined, these campaigns are critical for soliciting game-changing votes.

Which brings me to the central theme of this post and the paper: bots. When ad dollars or a human movement are out of reach, bots are the cheap, fast and impactful alternative. Bots are algorithms that produce a programmed result, either fully automatically or triggered by a human, with the objective of mimicking human behavior and creating the impression of it. We have all seen or interacted with bots on social media after reaching out to customer service. We have all heard or received messages from bots trying to set us up with “the chance of a lifetime”. But do we always know when we’re interacting with bots? Are there ways of telling an algorithm apart from a human?
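To make the idea concrete, here is a minimal, self-contained sketch of how such automation works in principle. The trigger keywords, canned replies and simulated timeline are all hypothetical, and no real platform API is involved; a real bot would post through a platform’s developer interface rather than printing to the console.

```python
# Toy illustration of an automated partisan reply bot.
# Hypothetical and self-contained: posting is simulated with print statements.
import random

CANNED_REPLIES = [
    "The other side is lying to you again. Retweet if you agree!",
    "Wake up! The mainstream media won't report this.",
]

TRIGGER_KEYWORDS = {"election", "ballot", "candidate"}

def should_engage(post_text: str) -> bool:
    """Engage only with posts that mention a tracked political keyword."""
    words = set(post_text.lower().split())
    return bool(words & TRIGGER_KEYWORDS)

def reply(post_text: str) -> None:
    """Simulate posting a canned, partisan reply that looks human-authored."""
    print(f"replying to: {post_text!r}")
    print(f"bot says:    {random.choice(CANNED_REPLIES)}")

# Simulated timeline the bot scans automatically
timeline = [
    "Just got my ballot in the mail!",
    "Nice weather today.",
    "Which candidate do you support?",
]

for post in timeline:
    if should_engage(post):
        reply(post)
```

The point of the sketch is how little is needed: a trigger condition and a pool of prewritten messages already produce the impression of a human, partisan account at scale.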

Researchers from Indiana University Bloomington took on these important questions in their paper titled Asymmetrical Perceptions of Partisan Political Bots. It explains the psychological factors that shape our perception and decision-making when interacting with partisan political bots on social media. Political bots are used to impersonate political actors and interact with other users in an effort to influence public opinion. Bots have been known to facilitate the spread of spam mail and fake news. They have been used and abused to amplify conspiracy theories. Usage leads to improvement. In conjunction with enhanced algorithms and coding, this poses three problems:

(1) Social media users become vulnerable to misreading a bot’s actions as human.
(2) A partisan message, campaign success or sensitive information can be scaled up through networking effects and coordinated automation. A frightening example would be the use of bots to declare an election won while voters are still casting their ballots. And
(3) political bots are inherently partisan. A highly polarized media landscape offers fertile ground for political bots to exploit biases and political misconceptions. That means being fooled isn’t even necessary; mere exposure to a partisan political bot can lay the groundwork for later manipulation or influence of opinion.

Examples of low-ambiguity liberal (left), low-ambiguity conservative (right) profiles used as stimuli. Identifiable information is blurred.

The research focuses on whether certain individuals or groups are more easily influenced by partisan political bots than others. This recognition task depends on how skillfully individuals or groups can detect a partisan narrative, recognize their own partisan bias and either navigate through motivated reasoning or drown in it. Motivated reasoning can be seen as in-group favoritism and out-group hostility, i.e. conservatives favor Republicans and disfavor Democrats. Contemporary detection methods include (1) network-based detection, i.e. bots are presumed to be interconnected, so detecting one exposes connections to other bots; (2) crowdsourcing, i.e. engaging experts in the manual detection of bots; and (3) feature-based detection, i.e. a supervised machine-learning classifier is trained on the characteristics of known political accounts and continuously matches new accounts against markers of inauthentic behavior. These methods can be combined to increase detection rates. At this interesting point in history, it is an arms race between writing code for better bots and building systems that identify novel algorithms at scale. This arms race, however, is severely detrimental to democratic processes, as these tools are potent enough to deter participants at the opposing end of the political spectrum or at least undermine their confidence.
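To illustrate the feature-based approach, here is a minimal sketch of a supervised classifier over account-level features. The features (follower/friend ratio, posts per day, account age), labels and numbers are invented for illustration and do not reproduce the researchers’ models or training data.

```python
# Minimal sketch of a feature-based bot classifier (toy data, purely illustrative).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical features per account: [follower/friend ratio, posts per day, account age in days]
X = np.array([
    [0.10, 180.0, 30],    # hyperactive, very young account
    [1.20, 5.0, 2400],    # balanced, old account
    [0.05, 240.0, 12],
    [0.90, 8.0, 1800],
    [0.20, 150.0, 45],
    [1.50, 3.0, 3000],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = bot, 0 = human (hypothetical labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Score an unseen account: predict_proba returns [P(human), P(bot)]
new_account = np.array([[0.08, 200.0, 20]])
print("bot probability:", clf.predict_proba(new_account)[0, 1])
```

A real detector would rely on far more behavioral and network features and much larger labeled datasets, and, as the authors caution, the human annotations used to build such training data can themselves carry partisan bias.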

Examples of high-ambiguity liberal (left) and high-ambiguity conservative (right) profiles used as stimuli. Identifiable information is blurred.

The researchers found that knowingly interacting with partisan political bots only magnifies polarization, eroding trust in the opposing party’s intentions. However, a regular user will presumably struggle to discern a political bot from a politically motivated (real) user, which leaves the potential voter base vulnerable to automated manipulation. To get at this, the researchers focused on identifying the factors that shape human perception when it comes to the ambiguity between real user and political bot, as well as the recognition of a bot’s coded partisanship. Active users of social media were more likely to establish a mutual following with a political bot. These users tended to be more conservative, and the political bots they chose to interact with were likely conservative too. Time played a role insofar as active users who took more time to investigate and understand the political bot, which they only saw as a regular-looking, partisan social media account, were less likely to accurately discern real user from political bot. In the results, this demonstrated (1) a higher chance for conservative users to be deceived by conservative political bots and (2) a higher chance for liberal users to misidentify conservative (real) users as political bots. The researchers conclude that

users have a moderate capability to identify political bots, but such a capability is also limited due to cognitive biases. In particular, bots with explicit political personas activate the partisan bias of users. ML algorithms to detect social bots provide one of the main countermeasures to malicious manipulation of social media. While adopting third-party bot detection methods is still advisable, our findings also suggest possible bias in human-annotated data used for training these ML models. This also calls for careful consideration of algorithmic bias in future development of artificial intelligence tools for political bot detection.

I am intrigued by these findings. Humans tend to struggle to establish trust online. It is surprising and concerning that conservative bots may be perceived as conservative humans by Republicans, while conservative humans may be perceived as bots by Democrats. The potential to sow distrust and polarize public opinion is nearly limitless for a motivated interest group. While policymakers and technology companies are beginning to address these issues with targeted legislation, it will take a concerted multi-stakeholder approach to mitigate and reverse the polarization spread by political bots.

Why Are We Sharing Political Misinformation?

Democracy is built upon informed decisions made by the rule of the majority. As a society, we can’t make informed decisions if the majority is confused by fake news, that is, false information distributed and labeled as real news. It has the potential to erode trust in democratic institutions, stir up social conflict and facilitate voter suppression. This paper by researchers from New York University and the University of Cambridge examines the psychological drivers of sharing political misinformation and provides solutions to reduce the proliferation of misinformation online.

tl;dr

The spread of misinformation, including “fake news,” disinformation, and conspiracy theories, represents a serious threat to society, as it has the potential to alter beliefs, behavior, and policy. Research is beginning to disentangle how and why misinformation is spread and identify processes that contribute to this social problem. This paper reviews the social and political psychology that underlies the dissemination of misinformation and highlights strategies that might be effective in mitigating this problem. However, the spread of misinformation is also a rapidly growing and evolving problem; thus, scholars also need to identify and test novel solutions, and simultaneously work with policy makers to evaluate and deploy these solutions. Hence, this paper provides a roadmap for future research to identify where scholars should invest their energy in order to have the greatest overall impact.

Make sure to read the full paper titled Political psychology in the digital (mis)information age by Jay J. Van Bavel, Elizabeth Harris, Philip Pärnamets, Steve Rathje, Kimberly C. Doell, Joshua A. Tucker at https://psyarxiv.com/u5yts/ 

It’s no surprise that misinformation spreads significantly faster than the truth. The illusory truth effect captures part of this phenomenon: misinformation that people have heard before is more likely to be believed. We have all heard a juicy rumor in the office long before learning whether it is remotely true or made up altogether. Political misinformation takes the dissemination rate to the next level. It is shared at far greater rates due to its polarizing nature, driven by partisan beliefs and personal values. Even simple measures seemingly beneficial to all of society face an onslaught of misinformation. For example, California Proposition 15, designed to close corporate tax loopholes, was opposed by conservative groups that resorted to spreading misinformation about the reach of the law. They conflated corporations with individuals, making it a family affair to solicit an emotional response from the electorate. It is a prime example of a dangerous cycle in which political positions drive misinformation, which in turn deepens political division and obstructs the informed decisions voters need to make. Misinformation is shared more willingly, more quickly and despite contradicting facts if it is in line with the sharer’s political identity and seeks to derogate the opposition. In the example above, misinformation about Proposition 15 was largely shared if it (a) contained information in line with partisan beliefs and (b) sought to undercut the opponents of the measure. As described in the paper, the more polarized a topic is (e.g. climate change, immigration, pandemic response, taxation of the rich, police brutality), the more likely misinformation is to be shared by political in-groups against their political out-groups without further review of its factual accuracy. This predisposed ‘need for chaos’ is hard to mitigate because the feeling of being marginalized is a complex, societal problem that no one administration can resolve. Further, political misinformation tends to be novel and to trigger more extreme emotions of fear and disgust. It tends to conflate being better off with being better than another political out-group.

Potential solutions to limit the spread of political misinformation can already be observed across social media:

  1. Third-party fact-checking is a second review by a dedicated, independent fact-checker committed to neutrality in reporting information. Fact-checking does reduce belief in misinformation but is less effective for political misinformation. Ideological commitments and exposure to partisan information foster a different reality that, in rare extreme cases, can create scepticism of fact-checks and lead to increased sharing of political misinformation, the so-called backfire effect.
  2. Investing in media literacy to drive ‘pre-bunking’ efforts that counter false information before it gains traction, for example by offering tips or encouraging critical reflection on certain information, is likely to produce the best long-term results. It might, however, be difficult to implement effectively for political information, as media literacy depends on the provider and bipartisan efforts are likely to be opposed by the extremes of each side.
  3. Disincentivizing viral content by changing the monetization structure to a blend of views, ratings and civic benefit would be a potent deterrent to creating and sharing political misinformation (a toy sketch of such a blended score follows below). However, this measure would likely conflict with the growth objectives of social media platforms in a shareholder-centric economy.
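To illustrate the third idea, here is a rough sketch of what a blended monetization score could look like. The metric names, weights and example numbers are my own assumptions for illustration; the paper does not propose a specific formula.

```python
# Toy sketch of a blended monetization score combining views, ratings and a
# civic-benefit signal (hypothetical weights and inputs, for illustration only).

def monetization_score(views: int, avg_rating: float, civic_benefit: float,
                       w_views: float = 0.2, w_rating: float = 0.3,
                       w_civic: float = 0.5) -> float:
    """Blend reach, audience rating and civic benefit into a single payout weight.

    views: raw view count
    avg_rating: audience rating normalized to [0, 1]
    civic_benefit: fact-check / civic-value signal normalized to [0, 1]
    """
    # Cap raw reach so virality alone cannot dominate the payout.
    reach = min(views / 1_000_000, 1.0)
    return w_views * reach + w_rating * avg_rating + w_civic * civic_benefit

# A viral but low-civic-value post vs. a modest but civically valuable post
print(monetization_score(views=5_000_000, avg_rating=0.4, civic_benefit=0.1))  # 0.37
print(monetization_score(views=200_000, avg_rating=0.8, civic_benefit=0.9))    # 0.73
```

The design choice is that raw reach is capped, so under these assumed weights a viral, low-civic-value post earns a lower payout weight than a modest post with a strong civic-benefit signal.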

This paper is an important contribution to the current landscape of behavioral psychology. Future research will need to focus on developing a more comprehensive theory of why we believe and share political misinformation, but also of how political psychology correlates with incentives to create it. It will be interesting to learn how the underlying psychology can be used to alter the lifecycle of political information across different platforms, mediums and channels.

The Influence of Technology on Presidential Primary Campaigns

Without social media, there would not be a President Trump. We all felt the Bern in 2016. And let’s not forget “pizzagate” or “Yes we can”. The power of technology has undeniably impacted elections for political office, but how does it influence voters’ decisions on election day? Is social media the lone culprit undermining the integrity of our democracy, or does history offer insights of a sobering nature? These and other questions are subject to analysis by Anthony J. Gaughan of Drake University. Here’s a rundown of his paper “The Influence of Technology on Presidential Primary Campaigns”.

The paper examines technological innovations impacting Presidential primary campaigns. The Supreme Court decisions Buckley v. Valeo and Citizens United v. FEC appear to have cleared the regulatory playing field for unrestricted campaign spending. Contrary to popular belief, Presidential primary campaigns are not skewed toward the wealthiest candidates. They simply favor the candidate who is most savvy with current technology and best able to leverage an audience for the benefit of the campaign.

Gaughan starts his analysis as early as 1912, when Presidential primary campaigns relied on the rail network to expose the candidate to crucial constituents. The radio would bring about change by offering a low-cost, easily and broadly accessible medium available to the general public. It would demonstrate that a candidate with the ability to make his audience feel they knew him like nobody else could overcome other limitations of his persona. By the early 1950s, television would enter the scene to influence voters. The Presidential candidate John F. Kennedy overcame incredible odds to win the Protestant state of West Virginia, despite being Catholic, by leveraging TV ads that presented him as a handsome, professional leader who “would not take orders from any Pope, Cardinal, Bishop or Priest”. Television gave rise to Roger Ailes and others who would reshape the appearance of political candidates for public office, most notably in the 1968 Presidential primary campaign of Richard Nixon. Nixon’s “funny-looking” appearance would be carefully marketed by distributing only selected shots, accompanied by strong soundtracks and professional flattery, in the process beating Nelson Rockefeller and then California Governor Ronald Reagan.

Gaughan concludes his analysis with the emergence of the internet. Senator Barack Obama managed to outfox Senator Hillary Clinton in the 2008 Presidential primary campaign by leveraging social media, which allowed his campaign to build a volunteer network, use data analytics to identify potential voters and eventually outperform the Clinton campaign in the majority of voting metrics. The internet also offered a platform to establish direct communication with his constituents. Obama’s success, however, inspired a real-estate businessman and reality-TV celebrity from the other end of the political spectrum. It gave rise to Donald J. Trump. Trump took the Reagan playbook of celebrity fame turned politics and merged it with 21st-century innovations. On Twitter and Facebook, the Trump campaign selectively spent on ads in crucial swing states to gain political momentum with polarizing memes and divisive content. His existing television fame helped with national recognition, but the free coverage generated through the power of social networks put him over the top against established Republican candidates, front and center before the voters.

The paper concludes that the 2016 Presidential primary campaign was a harbinger of things to come. It is not far-fetched to reason that internet communication will continue to boost political speech across new platforms like TikTok or through new mediums such as virtual and augmented reality. Political candidates entering a primary race can leverage these tools by hiring campaign staff who are native to social media communications and able to take the pulse not only of millennial and adolescent voters but of the party’s voter base beyond retirement age and everybody in between. New tools to analyze, scale and engage audiences, which most platforms offer as part of the advertisement deal, have the power to enable political novices to make a bid for office. From a regulatory point of view, legislators must revisit campaign spending rules to level the playing field for the networking effects that come with social media. In the interest of voters and of fair, democratic elections, it might be advisable to focus future legislation not on campaign spending in the sense of financial assets but on the actual reach of the audience, including the means used to facilitate that reach.