Tools Help Track Disinformation from Its Sources as They Try to ‘Slow the Spread’

Hop on over to Facebook and check out a friend or family member’s post about COVID-19, masks or the vaccine. Everyone has a different opinion, defending it with ‘facts’ heard ‘elsewhere.’ Facebook, like other social media channels, works like the old game of telephone played across public update feeds, and it is one of the ways disinformation spreads rapidly.

The good news is that the PR industry is working on tools and processes to curb the spread of disinformation. PR trade groups and agencies recognize disinformation as an issue that clouds their clients’ messaging.

Recently Ruder Finn and Edelman joined the ranks of those offering tools to combat disinformation and identify sources of some of the noise. The tools take similar approaches to disinformation in that they go beyond traditional social media monitoring. Both eye fringe sites and platforms on the dark web. In addition, they attempt to understand the sources of disinformation and spike narratives early, before they’ve had a chance to spread.


‘Truth Vector’

Ruder Finn’s Truth Vector is a digital tool that uses machine learning and AI to “analyze social conversation, particularly on fringe social networks,” through what it calls Truth Sonar, according to RF’s CTO Tejas Totade.

“It analyzes how narratives travel across social media and the underlying groups (i.e. factions) that influence and amplify these narratives,” Totade says. “These insights empower the Ruder Finn team to provide predictive and data-driven storytelling recommendations to the client.”

After the tool encounters misinformation, it enables Truth Vectors from receptive influencers to neutralize false information and defuse viral conversations with fact-based narratives and brand-supportive content, Totade says.

“Truth Vectors is the term we use to define the professional services we will be providing to our clients based on the narrative/faction insights we obtain from the tool,” Totade says. The Truth Vectors can:

  • Counter faction narratives with brand-supportive content
  • Neutralize false information with accurate, persuasive messaging
  • Directly challenge misleading material with factual narratives

Like Edelman’s Disinformation Shield, the Truth Vector aims to go beyond social listening.

“Social listening tools are typically good at telling what is happening or what has happened, whereas RF Truth Vector addresses the blind spots by answering what can happen,” Totade says. “This enables the team to help the clients’ communication and policy teams anticipate an incident or a viral event and help them prepare a robust and inclusive communication strategy for it.”

‘Disinformation Shield’

The Disinformation Shield is intended to flag, tag and blunt disinformation campaigns before they find their way to traditional and social media news feeds.

Patrick Hillman, EVP, crisis & risk, says it uses predictive analytics tools and human experts in behavioral science.

The Shield, he says, can track more than “50 known foreign and domestic disinformation factions on fringe platforms.”

“It’s important to point out that these factions actually workshop their disinformation narratives for weeks, if not for months, on fringe platforms like 8Kun, 4Chan, Gab, MeWe, among many others,” Hillman says.

Fringe platforms rely on rudimentary APIs (application programming interfaces) that traditional social media monitoring tools are unable to track, Hillman says.

“A proprietary tool allows us to use an AI platform to develop custom ‘crawlers’ to track disinformation on platforms without any API whatsoever,” he says.

“This gives organizations an early warning system and the ability to cast transparency on these actors’ motives before they spread disinformation to more traditional online channels.”
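To make the crawler idea concrete, here is a minimal, hypothetical sketch of what tracking a platform with no API can look like: scrape the raw HTML and pattern-match post text against a watchlist. The markup (`<div class="post">`), the sample page, and the keyword list are all illustrative assumptions; production tools like the one Hillman describes would use trained classifiers and far more robust scraping, not a stdlib parser and keyword matching.

```python
from html.parser import HTMLParser

# Hypothetical narrative phrases a monitoring team might watch for.
TRACKED_NARRATIVES = {"vaccine hoax", "grid collapse"}


class PostExtractor(HTMLParser):
    """Collects text from <div class="post"> elements -- a stand-in
    for whatever markup a fringe platform actually serves."""

    def __init__(self):
        super().__init__()
        self.in_post = False
        self.posts = []

    def handle_starttag(self, tag, attrs):
        if tag == "div" and ("class", "post") in attrs:
            self.in_post = True
            self.posts.append("")

    def handle_endtag(self, tag):
        if tag == "div" and self.in_post:
            self.in_post = False

    def handle_data(self, data):
        if self.in_post:
            self.posts[-1] += data


def flag_posts(html: str) -> list:
    """Return posts containing any tracked narrative phrase."""
    parser = PostExtractor()
    parser.feed(html)
    return [p.strip() for p in parser.posts
            if any(k in p.lower() for k in TRACKED_NARRATIVES)]


sample = """
<html><body>
<div class="post">Heard the vaccine hoax is back in the news</div>
<div class="post">Lovely weather today</div>
</body></html>
"""
print(flag_posts(sample))  # -> ['Heard the vaccine hoax is back in the news']
```

The point of the sketch is only that no API is required: everything the crawler needs is in the page markup, which is why such platforms fall outside what conventional monitoring dashboards ingest.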

Here is how it works:

  • The team activates monitoring to track more than 30 of the largest disinformation platforms, with the aim of uncovering active narratives that disinformation factions are pushing to harm a company or organization.
  • A team of behavioral scientists monitors daily activity and provides regular reports, tracking disinformation factions and evolving narratives, as well as the potential velocity, level of inauthenticity, and likelihood that false information will gravitate to traditional social media.
  • When a credible threat is identified, a predictive analytics team develops a custom audience model that maps out more than 240 million consumers who are likely targets of the disinformation campaign.
  • The corporate affairs team and behavioral scientists then develop a custom messaging campaign to deliver “inoculating information” to those susceptible audiences.
  • Edelman tracks sentiment in real time to ensure the intervening messaging is having the desired effect: preventing people from unknowingly sharing false information.
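
The triage logic implied by the steps above — score a narrative on velocity, inauthenticity and crossover risk, then escalate only credible threats — can be sketched as a toy model. The field names, weights and the 0.6 threshold are invented for illustration; Edelman's actual scoring is proprietary and presumably model-driven rather than a fixed weighted sum.

```python
from dataclasses import dataclass


@dataclass
class NarrativeSignal:
    """One tracked narrative, with normalized 0-1 risk signals
    (all fields and scales here are hypothetical)."""
    name: str
    velocity: float        # how fast the narrative is growing
    inauthenticity: float  # share of bot-like amplification
    crossover_risk: float  # chance of reaching mainstream feeds


def threat_score(sig: NarrativeSignal) -> float:
    """Toy weighted score; weights are made up for illustration."""
    return (0.4 * sig.velocity
            + 0.3 * sig.inauthenticity
            + 0.3 * sig.crossover_risk)


def triage(signals):
    """Escalate narratives above an (assumed) 0.6 threshold."""
    return [s.name for s in signals if threat_score(s) > 0.6]


watchlist = [
    NarrativeSignal("grid-collapse rumor", 0.9, 0.8, 0.7),
    NarrativeSignal("minor product gripe", 0.2, 0.1, 0.1),
]
print(triage(watchlist))  # -> ['grid-collapse rumor']
```

Only narratives that clear the threshold would move to the later steps — audience modeling and the "inoculating" counter-messaging campaign.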

Edelman works with Cambridge University, The Atlantic Council’s DFRLab and others to flag attempted attacks and refine its approach to further insulate organizations from future attacks.

While Hillman doesn’t name the companies using the tool, he says vaccine safety, American infrastructure stability and faith in the U.S. economy are some of the areas the Disinformation Shield program is monitoring.

Nicole Schuman is Senior Editor of PRNEWS.