Categories: Election


Monitoring Online Discourse During An Election: A 5-Part Series

How to monitor social media with AI-based tools during an election campaign

 Traditional election monitoring is a formalized process in democratic countries, set out in the mandate of the national Electoral Management Body (EMB). As social and digital cultures change, however, EMBs are finding it useful to expand their monitoring capacities to include social media.

Given the media coverage of interference with the 2016 US and UK elections, and the fallout from the Cambridge Analytica debacle, politicians and the public are wary of the impact social media manipulation can have on electoral processes.

As well, automated tools like bots, created locally or outside the country, disrupt the existing system. They amplify political messaging, yet are not currently covered by political financing regulations, and they can disseminate disinformation.

Tracking social media allows an EMB to stay on top of operational issues during an election period (see Part III: Managing Operational Issues) and also to detect and track disinformation and its spread (see Part II: Identifying Disinformation).

This Playbook is designed for EMBs. As a template, it will need to be adapted to each jurisdiction. It describes social media monitoring as a function within an EMB, and assumes that the EMB has access to an AI-based social media monitoring tool, such as KI Social.

To be effective, this process should be put in place well before the first milestone of the election period.


1.    Setting up

Before the project begins:

·         Ensure there is clarity regarding the goals of the social media monitoring. A key issue is scope: does the EMB's mandate include monitoring of operational issues, or disinformation and misinformation, or all of the above? (see Part III: Managing Operational Issues).

·         Does the mandate include tracking voter issues expressed within national borders, or will it also include voters travelling or living abroad? This is important for the many nations with large expatriate communities. If results should include social media posts from voting citizens residing outside their country, it won’t be possible to limit data by geographic location.

·         Will the monitoring function be active (occurring continuously), passive (occurring once a week, for example), or retroactive (taking place after each election milestone is completed)?

·         In this phase, the EMB should inventory its internal staff capacity: for example, does its staff include social media content producers, policy personnel, social media analysts, social media monitors, or data scientists? If not, arrange for an experienced vendor such as KI Design to provide these services.


Technical set-up: The technical team within the EMB should document:

a.    All EMB web and social media assets

b.    All relevant national and international news sites

c.    Lessons learned from the previous election

d.    The list of all confirmed candidates, once it is finalized

e.    All political party data and web assets

f.     Details of political spending on Twitter, Reddit, Facebook pages, and other platforms

2.    Engage a social media service provider with the following capabilities:

a.    Full firehose access to data, going back at least one previous election. It's vital to be able to analyse data from the previous election to understand what issues may recur in the current one. For example, there may be specific complaints related to a particular location, or to the capacity of polling station staff. That said, historic analysis will not provide a complete picture; new issues and ambiguities will arise.

b.    Ability to track keywords that are not necessarily related to elections; for example, power outages, roadblocks, protests, etc.

c.    Geolocation capacities:

i.    For posts with geolocation tags, the tool should display post locations on a map. For example, posts may complain about ballot non-delivery in a certain region.

ii.    For posts without geolocation tags, the tool should be able to group and map them visually, to show concentration. For example, a post without a geolocation tag may state "unable to find [named] polling station on EMB website". The visualisation component is important so that logistical issues can be dealt with collectively rather than as individual instances.

d.    The tool should permit custom views for various EMB staff skillsets. For example: content producers would want to measure the volume of incoming and outgoing messages on the EMB’s social media channel; data scientists may want to write sophisticated queries; issue managers may need views that show whether or not posts have been responded to.

e.    Your vendor should be able to provide data science analytics and application dashboard customization expertise.

f.     Your vendor needs to have experience in provisioning data science services to EMBs.
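Capability (c)(ii) above – grouping posts that lack geolocation tags by the places they mention – can be sketched as a simple gazetteer lookup. The place names and posts below are illustrative placeholders, not real data; a production system would draw its gazetteer from the EMB's list of electoral districts and polling locations:

```python
from collections import Counter

# Illustrative gazetteer; in practice, sourced from the EMB's own
# district and polling-location data.
GAZETTEER = {"springfield", "riverdale", "lakeview"}

def infer_locations(posts):
    """Count posts per mentioned place name, for posts without geotags."""
    counts = Counter()
    for post in posts:
        words = set(post.lower().replace(",", " ").split())
        for place in GAZETTEER & words:
            counts[place] += 1
    return counts

posts = [
    "Unable to find Springfield polling station on EMB website",
    "Long lines reported in Riverdale",
    "Springfield ballot drop-off moved?",
]
print(infer_locations(posts))  # Counter({'springfield': 2, 'riverdale': 1})
```

Plotting these counts on a map is what turns scattered individual complaints into a visible regional pattern.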

3.    Noise elimination

A query contains an expression composed of keywords, emojis, and URLs to be tracked. These will include both the keywords you are looking for and many keywords you want excluded from the search. For example: in a national election, if the query contains the phrase “election monitor,” the results could include any and all election monitoring occurring anywhere in the world, as well as election monitoring within municipalities, cities, unions, associations, or the UN, or from other regional-level elections.

In an election context, without noise elimination, some 90% of the results of a query are irrelevant. Hence, a large portion of the query should be dedicated to eliminating these irrelevant results.

Issues to be aware of include:

a.    Elections in other countries may be taking place concurrently; for example, an Indian state election and a national UK election. Especially if both countries share a common language, online discourses may include overlapping content; for example, place names or street names.

b.    Name similarities of candidates with other citizens.

c.    Election talk on social media will be dominated by countries with higher per capita access to the Internet, and in particular those whose citizens most frequent Twitter, such as the US, UK, and France.

d.    In multi-lingual countries, where many unofficial languages are spoken, queries should aim to capture election discourse in languages other than the official one(s). With that comes the need for noise elimination related to the nations where those other languages are dominant.
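The include/exclude logic that drives noise elimination can be sketched as a basic post filter. The keyword lists here are tiny illustrative placeholders; a production query would contain hundreds of terms tuned to the election at hand:

```python
INCLUDE = {"election monitor", "polling station", "ballot"}
# Noise terms: concurrent foreign elections, union/student elections, etc.
EXCLUDE = {"uk election", "union election", "student election"}

def matches_query(text, include=INCLUDE, exclude=EXCLUDE):
    """Keep a post only if it contains an include term and no exclude term."""
    t = text.lower()
    if any(term in t for term in exclude):
        return False
    return any(term in t for term in include)

assert matches_query("Where is my polling station?")
assert not matches_query("Union election monitor results announced")
```

Note that exclusion runs first: a post mentioning both “election monitor” and “union election” is dropped, which is exactly the behaviour needed when roughly 90% of raw results are noise.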

4.    Your social media monitoring tool should include three distinct functions:

a.    Dashboards and reports: Real time, periodic (e.g., every four hours), daily, weekly, or monthly

b.    Data feeds: Each feed is dedicated to:

i.    Operational issue/s

ii.    Capturing the EMB's footprint; this feed is dedicated to finding any posts that mention the EMB, its leadership, or the relevant legislation

iii.    All parties and all candidates (including any events or investments or announcements by the political parties)

iv.    Data feeds specific to disinformation and misinformation

c.    Alerts of any media mentions that are of particular interest to the EMB
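The feed-and-alert structure in step 4 can be captured as configuration. A hypothetical sketch – the field names and feed definitions are illustrative, not KI Social's actual API:

```python
from dataclasses import dataclass

@dataclass
class Feed:
    name: str
    keywords: list
    alert: bool = False          # raise an alert on any match
    refresh_minutes: int = 240   # periodic refresh, e.g. every four hours

feeds = [
    Feed("operations", ["power outage", "roadblock", "long lineup"]),
    Feed("emb_footprint", ["EMB", "Chief Electoral Officer"], alert=True),
    Feed("parties_candidates", ["party announcement", "candidate event"]),
    Feed("disinformation", ["election cancelled", "vote by text"], alert=True),
]

# Feeds that page staff immediately rather than waiting for a report cycle.
alerting = [f.name for f in feeds if f.alert]
print(alerting)  # ['emb_footprint', 'disinformation']
```

Keeping feeds as declarative configuration makes it easy to add a milestone-specific feed, or to flip a feed from periodic reporting to real-time alerting as election day approaches.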

5.    Create specific filters to target election milestones

a.    Content regarding election steps prior to election day:

i.    Voter registration

ii.    Ballot mailing

iii.    Citizens moving residence

iv.    Allowed pieces of identification

v.    Election date

vi.    References to bias by EMB officials

vii.    Impersonation of EMB or political candidates

b.    On voting days (advance polling and election day):

i.    Lineups

ii.    Availability of paper ballots

iii.    Registration issues

iv.    Staff issues

v.    Directions to polling station

vi.    Power outages

vii.    Poll relocation

c.  Ballot counting hours: analysing concerns and content appearing after polling stations are closed and before the results are issued.

d.  Post-election reporting: providing aggregate data on the monitoring activity and the number of situations averted, mitigated, or responded to.
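The milestone filters in step 5 can be expressed as keyword sets keyed by election phase, so that the active filter follows the electoral calendar. A hypothetical sketch with illustrative keywords:

```python
# Illustrative phase-to-keyword mapping; real filters would be far larger
# and localized to the jurisdiction's terminology.
MILESTONE_FILTERS = {
    "pre_election": [
        "voter registration", "ballot mailing", "change of address",
        "voter id", "election date", "impersonation",
    ],
    "voting_days": [
        "lineup", "paper ballot", "registration issue",
        "polling station directions", "power outage", "poll relocation",
    ],
    "ballot_counting": ["count delay", "recount", "results"],
}

def active_keywords(phase):
    """Return the keyword set for the current election phase."""
    return MILESTONE_FILTERS.get(phase, [])

print(len(active_keywords("voting_days")))  # 6
```

Switching the active phase as each milestone passes keeps the monitoring focused, since the discourse around ballot mailing and the discourse around polling-day lineups rarely overlap.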


Part of a 5-part series on

Monitoring Online Discourse During An Election:

PART ONE:  Introduction

PART TWO:  Identifying Disinformation

PART THREE:  Managing Operational issues

PART FOUR:  KI Design National Election Social Media Monitoring Playbook

PART FIVE:  Monitoring Political Financing Issues



Monitoring Online Discourse During An Election: A 5-Part Series

The advantages of managing election logistical issues through social media.

Organizing the logistics of an election is a complex process. It’s a question of scale; the sheer numbers involved – of voters, of polling options and locations, and of election materials – means that things can, and will, go wrong.


Operational issues can be divided into two types. There are logistical concerns, such as:

  • Delay in receiving ballots in the mail
  • Questions about options of voting electronically or by mail
  • Incorrect name or address on a ballot
  • How to find information on where to vote
  • Confusion regarding polling station location
  • Confusion regarding the hours that polling stations are open
  • Accessibility issues
  • Confusion regarding what ID to bring
  • Power outages that impact polling stations
  • Roadblocks and construction impeding access to a polling station
  • Whether polling hours are delayed
  • Whether there is a long line-up and voting is delayed
  • Availability, and courtesy, of EMB personnel
  • Conflicts at the polling station
  • Issues regarding third-party election monitors (if applicable)
  • Police presence
  • Claims of dead people or non-citizens voting

As well, there are problems caused by the propagation of disinformation (or misinformation).

What role is played by Disinformation?

There can often be an overlap between Operational Issues, Disinformation, and Misinformation. Tweets regarding the location of a particular polling station fall into the Operational Issues category, but that information may be mistaken (Misinformation) or deliberately misleading (Disinformation). There is an almost complete overlap between Disinformation and Misinformation – the only difference is the intent behind the sharing of inaccurate information.

As the table below demonstrates, many Operational Issues may also become targets of Disinformation or Misinformation.


Why should EMBs monitor social media?

EMBs have a formal complaints process, and if concerns are raised outside that process, EMB staff are not obligated to respond. However, given the pervasive nature of social media, vexed voters are much more likely to grouse on the Internet than to file a formal complaint. Social media has become an informal complaints process; Twitter in particular. With its use of hashtags, Twitter dominates social media election discourse. (Election discussion on Facebook, Telegram, and WhatsApp takes place on private pages.) The chart below shows social media discourse around the 2019 UK general election with the hashtag #GE2019, by volume.
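Volume charts like the #GE2019 one referenced above reduce to counting matching posts per day. A minimal sketch with made-up timestamps (a real feed would supply millions of posts):

```python
from collections import Counter
from datetime import date

# Illustrative posts as (date, text) pairs; not real data.
posts = [
    (date(2019, 12, 11), "Ready to vote tomorrow #GE2019"),
    (date(2019, 12, 12), "Long queue at my polling station #GE2019"),
    (date(2019, 12, 12), "Polls close at 10pm #GE2019"),
    (date(2019, 12, 12), "Unrelated post about football"),
]

def daily_volume(posts, hashtag="#ge2019"):
    """Count hashtag-matching posts per day, case-insensitively."""
    return Counter(d for d, text in posts if hashtag in text.lower())

print(daily_volume(posts))
```

Plotted over the election period, these daily counts make the milestone flashpoints in the discourse immediately visible.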

What can EMBs do about social media-based complaints?

Complaints can fall into one of several categories:

Social media as a mass communication tool: Social media messaging can mitigate public discontent, respond proactively to problems, and send broad messaging demonstrating that the EMB is in control of the situation. For example: after complaints of robocalls which state the election date has changed, the EMB could tweet that these robocalls are giving false information and should be ignored. Such messaging will be picked up by traditional media.

When should EMBs monitor social media?

Monitoring should occur throughout the election period. Election milestones tend to be flashpoints when online discourse increases – these are highlighted in the diagram below.



Identifying Disinformation — Part II

Monitoring Online Discourse During An Election: A 5-Part Series

Using AI to track disinformation during an election campaign.


How can online disinformation be identified and tracked? KI Design provided social media monitoring solutions for the 2019 Canadian federal election.[1] KI Social is a suite of tools designed to support three main areas of an Electoral Management Body's (EMB) election monitoring as it pertains to social media:

Disinformation: False information spread deliberately to deceive, including:

Operational issues: Problems related to practical aspects of the voting process.

Political financing issues: This can be divided into two main categories:

Online voter discourse occurs in waves. It generally peaks in the event of any significant political incidents, and around election milestones such as:

In providing monitoring of online electoral discourse, KI Social analyses posts originating on Facebook and Instagram pages, Twitter, Reddit, Tumblr, Vimeo, YouTube (including comments), blogs, forums, and online news sources (including comments). The platform provides real-time monitoring, sentiment and emotion analysis, and geo-location mapping.

Using AI analytics and classification mining, KI Social can identify disinformation and misinformation sources, discourses, content, and frequency of posting. The platform maps relationships between various disinformation sources, and within content.

How Disinformation Impacts Elections, and Voters

Disinformation undermines democracy. That’s its purpose. Usually extreme in nature, fake news polarizes people, creating or exacerbating social and political divisions, and breeding cynicism and political disengagement. Even the reporting of disinformation campaigns (Russian electoral interference, for example) adds to the destabilization, making people wary of what to believe.

“Over the past five years, there has been an upward trend in the amount of cyber threat activity against democratic processes globally…. Against elections, adversaries use cyber capabilities to suppress voter turnout …”[2]

Disinformation campaigns often focus on election processes. The aim is to lower voter turnout, by preventing people from voting, or simply by making them less inclined to do so. This can play out in different ways.

Below are examples of some disinformation topics, drawn from posts from different countries, that can be found on social media during an election period.

Making it harder for people to physically vote

False information gets spread about the location of a polling station, or its hours of operation, or power outages on site causing long line-ups. Fake news like this creates confusion, making people less likely to vote.

 Undermining voter trust in the EMB

Findings included:


This was fake news – no one can vote in a federal election in Canada unless they are a Canadian citizen. Other posts criticized the government for allowing prisoners to vote, even though this is a well-established right under the Canadian Charter of Rights and Freedoms. Both these types of posts foster a sense of disenchantment with “the system.”

Other posts:

Undermining voter trust in the electoral process generally

Findings included:

Polarizing the electorate


Other sources of disinformation include:


Identifying Disinformation, and Those Who Disseminate It

In monitoring elections, the social media analyst is confronted with an enormous amount of data. The key to accessing and interpreting that data is the keywords and queries the analyst chooses to use. To be effective, these must be shaped by a close understanding of the political context. This process is “highly selective,” notes Democracy Reporting International. “It is not possible to have a comprehensive view of what happens on all social media in an election. Making the right choices of what to look for is one of the main challenges of social media monitoring.”[4]


To help circumvent this challenge, the KI Social platform:


When analysts track disinformation, they usually have preconceived ideas of what it will look like. EMBs are very familiar with mainstream media, and how to track potential disinformation within it (false claims by politicians, for example). Traditional manual modes of monitoring rely on this history of prior examples, and require that human monitors read and analyze every single post to decide whether it’s disinformation.


Some EMBs expand their capabilities by leveraging automated tools. However, standard automated data queries are still based on prior examples and thus are error prone.


A third way of tracking disinformation is via AI-based tools. Such tools avoid these pitfalls by allowing the analyst to track unprecedented volumes of data, along with its location, context, and the sentiment being expressed. Negative emotion is key, as it is the main determinant of disinformation.


The diagram below illustrates how KI Social’s methodology can be used by EMBs to track disinformation.


Removing unwanted data is conducted at every stage of the process (for example, if analysts are studying an election in France, and there are elections in Ivory Coast at the same time, the Ivory Coast data will need to be filtered out of the results).


Disinformation can be expressed as an algorithm:


Disinformation =   (hate OR distrust OR obfuscation) × volume
                 ──────────────────────────────────────────────────────────
                  election context × (anger OR disgust OR sadness OR fear)
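The expression above can be read as a scoring heuristic. The sketch below is one possible literal reading – OR taken as a maximum, and the lower line as a denominator – with placeholder component scores; it is not KI Social's actual model:

```python
def disinfo_score(signals, volume, context, emotions):
    """One literal reading of the formula: OR as max(), the lower line
    as the denominator. Component scores are placeholders in [0, 1]."""
    hostile = max(signals["hate"], signals["distrust"], signals["obfuscation"])
    negative = max(emotions["anger"], emotions["disgust"],
                   emotions["sadness"], emotions["fear"])
    denominator = context * negative
    return hostile * volume / denominator if denominator else 0.0

# Illustrative inputs: strong hate signal, high volume, strong anger.
score = disinfo_score(
    signals={"hate": 0.8, "distrust": 0.2, "obfuscation": 0.1},
    volume=1000,
    context=1.0,
    emotions={"anger": 0.9, "disgust": 0.1, "sadness": 0.2, "fear": 0.3},
)
print(round(score, 1))  # 888.9
```

How the hostile-signal, context, and emotion scores are actually computed (by classifiers over post text) is the substance of the AI tooling; this sketch only shows how the components combine.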


Driven by these queries, KI Social provides the ability to answer the following questions:




Follow me at @drwhasssan


[1] This article reflects the views of KI Design, and not those of Elections Canada. The full report on how Elections Canada uses social media monitoring tools, including those created by KI Design, in the 2019 federal election can be found here: Office of the Chief Electoral Officer of Canada, Report on the 43rd General Election of October 21, 2019:

[2] Communications Security Establishment, Cyber Threats to Canada’s Democratic Process, Government of Canada 2017, page 5.

[3] All the posts below are in the public domain; nevertheless, we have removed the identity of the poster in the screenshots we provide.

[4] Democracy Reporting International, “Discussion Paper: Social Media Monitoring in Elections,” December 2017, page 2; online at: