Statement to UN-CTED on the Terrorist Exploitation of Emerging Technologies

On Friday, 30 September 2022, Samantha Kutner gave the following statement to UN CTED during their technical session to inform a special meeting of the Counter-Terrorism Committee on countering terrorist exploitation of ICT and emerging technologies.

Excellencies, distinguished delegates, panellists and participants. I wish to thank our colleagues and partners at the United Nations Security Council Counter-Terrorism Committee Executive Directorate for the invitation to take part in this important meeting, and I wish to thank my fellow panellists for their excellent insights. 

My name is Samantha Kutner. I am a subject matter expert on the Proud Boys, the Proud Boys project lead with the Khalifa Ihler Institute, and the co-founder of Glitterpill LLC.
At Glitterpill and the Khalifa Ihler Institute, we address the challenges posed by violent extremist and terrorist exploitation of ICT and emerging technologies by providing actionable data, analysis and consulting services to companies grappling with the issue of violent extremism and terrorism. 

We do this by mapping networked behaviours across platforms and communities. By understanding network dynamics and the role of individuals and users offline and online, we help teams identify the “superspreaders”, or spiders in the networks, who coordinate and spread harmful materials and promote terrorist activities. This supports network disruption and helps prevent extremist recruitment, coordination and potential attacks.

This work also informed our report for the January 6th Select Committee, illuminating the series of mobilizations the Proud Boys engaged in during the events leading up to January 6th.

One of the key challenges we face is the ambiguity of language and the fact that extremists constantly modify their behaviour online in an effort to circumvent moderation efforts. 

Working across languages and cultures, and within particularly large and transnational linguistic groups, poses additional challenges and requires human insight and a continually updated understanding of the coded language and behaviour of groups. As Dr Erin Saltman said, we cannot algorithm ourselves out of this issue. Meanwhile, attempts to moderate are raising very real censorship concerns.

Extremists are likely to use coded language to launder their ideology and evade detection. Human rights defenders, in contrast, may describe extremist threats bluntly. For this reason, their efforts to warn their communities may falsely trigger AI-based accountability measures, leading to the censorship of human rights defenders.

Beyond this, content amplification and proprietary algorithms designed to prolong use, encourage engagement, and ultimately make users engage with advertising materials pose extraordinary challenges - in particular when these systems are manipulated by extremists to spread content, fuel polarization and recruit.

We are embarking on the next horizon of technological development, including Web 3 and a potentially more distributed web. If we do not address these issues at the roots, I fear we will only amplify existing challenges. The web will become more nebulous, moderation efforts will become more difficult, and content will be stored more permanently. This, in addition to AI-generated content, deepfakes and the potential misuse of new shared online spaces built on virtual reality technologies, will present unprecedented challenges.

Content moderation difficulties increase when formats do not easily lend themselves to text-based analysis. We already see the challenges of moderating live-streamed content, including content streamed by extremists as they perform terrorist acts. 

While efforts to mitigate this are being made by organizations like the GIFCT, they only address the most high-impact elements of the issue. Clubhouse and other audio-oriented platforms pose new challenges in detecting problematic content and behaviours. Video platforms that allow for streaming are also being used for fundraising for extremist activity, which largely goes unmoderated. Beyond this, creative strategies like embedding QR codes linking to external content in imagery, video and live streams circumvent much of what has been accomplished in creating shared repositories of URLs, which are used to prevent the spread of links to harmful offsite content on the major social media platforms.

Beyond these creative paths to circumvent content moderation technologies and policy online, we have also recently observed the alarming use of 3D printing technology in the manufacture of small arms, used by violent extremists and terrorists to circumvent gun regulation and commit heinous acts of violence in countries like Iceland and Japan.

Regarding QAnon, we have seen new developments in the radicalized conspiratorial landscape. QAnon is a malleable ideology, and its fragmentation after bans by major social media companies allowed different strains to incubate in their own online containers and on fringe websites. This highlights the challenges posed by a more fragmented web.

We are already seeing the real-world harms resulting from these relatively novel online networks and behaviours, as well as lawfare deployed against companies and human rights defenders attempting to hold extremists accountable. The only way to address these harms is by taking a network approach and understanding the relationship between online and offline behaviours within nebulous networks. These networks transcend national borders, platforms, groups and ideologies. We also need to protect the safety and anonymity of human rights defenders, so that warning the public and providing intelligence that informs incident map data does not immediately paint them as targets.

Through this approach, the Khalifa Ihler Institute and Glitterpill are uniquely positioned to assist the UN, the Member States and the wider community of stakeholders in understanding these behaviours, facilitating network disruption and ensuring the safety of human rights defenders.

Thank you very much. I yield the floor back to you.


Samantha Kutner