The CFI project at the 2024 London FIMI Forum

14 October 2024
Image of a panel of speakers © EU Delegation to the UK

On 14 October, the CFI Project participated in the 2024 London FIMI Forum, titled “FIMI as a Global Threat in a Year of Elections”. This annual forum is organised by the EU Delegation to the United Kingdom of Great Britain and Northern Ireland.

The project co-organised the fourth panel discussion, "Security Implications of the Evolving FIMI Threat", focusing on Foreign Information Manipulation and Interference (FIMI) within the context of traditional security and defence.

A recording of the event is available to rewatch.

Speakers:

  • Margarita Akhvlediani, Programme Director, GO Group Media/JAMnews, Georgia
  • Carrie Goux, Acting Deputy Coordinator for Operations, Global Engagement Center, U.S. Department of State
  • Dr Nad’a Kovalcikova, Senior Analyst and CFI Project Director, EUISS
  • Sabrina Spieleder, Head of Information Environment Assessment, Public Diplomacy Division, NATO

Jessica Cecil moderated the panel.

Key takeaways: 

  • FIMI poses a significant threat to democracies worldwide; its global reach could further undermine public trust in institutions and influence the outcomes of democratic processes. FIMI is also increasingly a security threat: it is not limited to the information space and forms part of broader hybrid threats. 
  • The upcoming elections in Georgia will be a battleground for narratives. Processes happening online, aided by AI, will have a real-life impact on the people of Georgia. The funding of disinformation efforts should be studied further. Moreover, the label “foreign information manipulation” is used by both sides to delegitimise the other. Growing evidence identifies Russia as being behind these well-coordinated manipulative efforts in Georgia. 
  • Identifying hostile actors and countering evolving FIMI tactics, techniques and procedures nevertheless remains difficult due to the blurred lines between state and non-state actors, the engagement of local proxies and the rapid advancement of technology. The session highlighted the critical role of technology, the growing use of AI and the spread of increasingly convincing deepfakes in Russia’s war of aggression against Ukraine and beyond. 
  • Last year, the US Global Engagement Center (GEC) uncovered a Russian campaign in Latin America that was critical of Ukraine, NATO and the West. More recently, working with UK and Canadian partners, the GEC dismantled an operation aiming to subvert the government of Moldova, reaching partner countries through a joint diplomatic campaign. The GEC also launched the ‘Framework to Counter State Information Manipulation’ as a common operating model, built on five key areas: (1) national strategies and policies; (2) governance structures and institutions; (3) human and technical capacity; (4) civil society, independent media, and academia; and (5) multilateral engagement. 
  • International partnerships are at the core of an effective response to FIMI. Building coalitions of like-minded partners is essential for sharing intelligence, coordinating responses, and developing shared strategies. The G7, for example, has been actively engaged in efforts to counter FIMI via the Rapid Response Mechanism (RRM). The EEAS definition of FIMI is at the centre of this coalition-building effort. 
  • Hostile influence operations enabled by artificial intelligence (AI) are also increasingly disrupting democracies during peak events such as elections, the Olympic Games, or other moments of heightened societal attention. The rise in AI-driven disinformation cases highlights evolving threats such as deepfakes, voter confusion over AI authenticity, and unethical campaign practices. 
  • NATO has invested in the role of its Public Diplomacy Division (PDD) to enhance civilian-military cooperation in the realm of strategic communications, for example by working with military colleagues on signalling and exercises via strategic communications. 
  • Creating effective policies and regulations is crucial to address the challenges posed by FIMI. This includes measures to protect critical infrastructure, promote media literacy, and regulate standards for the development and use of technology in information warfare. Societal resilience is the key to preparedness, and fostering it is essential for mitigating the disruptive impact of FIMI. This involves strengthening democratic institutions, promoting civil society engagement, and investing in education and research. It was noted that, while freedom of speech must be preserved, it does not equal freedom of reach, as exemplified by the banning of RT and Sputnik, news outlets operating in connection with Russian intelligence services to undermine countries in the EU and elsewhere. 
  • Addressing FIMI requires continuous innovative thinking and holistic approaches, for example the “total defence” concept (e.g. lessons from Sweden and Singapore), enhanced cooperation among EU and NATO countries through the European Centre of Excellence for Countering Hybrid Threats and the NATO Strategic Communications Centre of Excellence in Riga, and new and expanded initiatives to bolster societal resilience to FIMI. 

Moreover, the CFI project organised a closed-door workshop, “From Bias to Brain Warfare? Enhancing Cognitive Security Against Malign Actors”, which emphasised the following: 

  • There is an increasing fragmentation of state-led influence operations aimed at circumventing sanctions and regulatory frameworks. A network approach is being employed, with a reorientation towards new audience pools. Offline tactics such as sabotage, cyberattacks, and impersonation are being integrated into FIMI operations. 
  • Escalating domestic wedge issues are being leveraged through mis- and dis-information, deepening societal polarisation. AI-generated content, particularly image manipulation, is increasingly being used to enhance the effectiveness of these influence operations, posing a cognitive threat to individuals' capacity for independent thinking. 
  • Open-source intelligence (OSINT) and network analysis can play a key role in enhancing cognitive security. These tools enable more accurate risk assessments related to disinformation and support proactive responses to information manipulation. 
  • Effective countermeasures to influence operations require both state-level strategies and whole-of-society cooperation. National examples highlight successful attribution practices and collaboration, while broader societal engagement is needed to monitor, evaluate and respond to evolving disinformation tactics. 
  • Public disclosures through intelligence assessments and threat advisories can help expose and deter FIMI activities. By signalling capability and adhering to international norms, such disclosures also build institutional trust and reinforce the legitimacy of defensive actions.