News: GIFCT and UN CTED Cohost Event on Emerging Technologies

20 September 2023 | GIFCT

The opening of the UN General Assembly (UNGA) has often been a moment of unique opportunity for the Global Internet Forum to Counter Terrorism (GIFCT). In 2019, GIFCT was established as an independent, expert-led non-profit organization, transitioning from a consortium of tech companies and fulfilling one of the recommendations of the Christchurch Call.

On the margins of the opening of the 78th UNGA on September 20, 2023, GIFCT and the UN Security Council Counter-Terrorism Committee Executive Directorate (CTED) cohosted a series of events focused on the current and potential impacts of emerging technologies — and in particular artificial intelligence (AI) — on terrorism and counterterrorism. The panels brought together GIFCT member companies and key stakeholders including governments, UN officials, civil society organizations, academics, practitioners, and policymakers.

AI & Counterterrorism: Threats & Opportunities

We started the day by reflecting on the observed and expected exploitation of AI by terrorists and violent extremists, and on the opportunities for industry and policymakers to develop effective mitigation strategies. A fireside chat featuring our Executive Director Naureen Chowdhury Fink, UN CTED’s David Scharia, Meta’s Neil Potts, and Microsoft’s Courtney Gregoire discussed generative AI’s power to increase platforms’ understanding of the context, including language and cultural competencies, that informs content moderation. The speakers touched on how generative AI has strengthened measures to prevent radicalization, particularly by curbing recruitment efforts by terrorist and violent extremist organizations, and on the enormous potential for tech companies to scale these efforts. They also stressed the need for continuous transparency around how generative AI is used in the training and decisions that inform online moderation.

The subsequent panel included experts Maggie Engler (Inflection AI), Liram Koblentz (Yale University), and Tom Thorley (GIFCT), with Erin Saltman (GIFCT) as moderator. Panelists explored a range of risks posed by generative AI, including the potential for terrorists and violent extremists to exploit the technology at scale. Speakers highlighted the importance of integrating safety-by-design and outlined examples such as safeguards on language models, detection models that go beyond text recognition to identify online campaigns, the use of watermarking technology and hash-sharing, and efforts to increase digital literacy.

Adversarial Shifts in the Threat Landscape & Intervention Potentials

The afternoon’s sessions showcased the work published so far by GIFCT’s Year 3 Working Groups, which brought together diverse stakeholders, including governments, academics, and civil society, whose contributions informed outcome products aimed at the wider tech ecosystem.

The second session brought together Will Allchorn (Richmond American University London), Jennifer Bramlette (UN CTED), and Adi Cohen (Memetica), with Tom Thorley (GIFCT) moderating. It highlighted the evolving threat landscape, touching on terrorist-operated websites, the use of chatbots for recruitment, the 3D printing of arms, the use of drones, the exploitation of cryptocurrencies, and malware and ransomware. Panelists also spoke of newer risks: the decentralization of content moderation in the fediverse (a system of independently hosted but interconnected servers), the use of ‘malevolent creativity’ on newer social media platforms built around short-form videos and memes, and the dangers posed by limited government capacities and complacency.

Legal Frameworks, Evaluating Incident Response, & Meaningful Transparency

The third session featured Laura DeBenedetto (Marketplace Risk), Jon Deedman (Richmond American University London), and Rita Jabri Markwell (Australian Muslim Advocacy Network), with Nagham El Karhili (GIFCT) moderating. Drawing on the Year 3 Working Groups’ outputs, it examined the community impacts of legal definitions of terrorism, frameworks for measuring and evaluating incident response work, and how meaningful transparency can support counterterrorism efforts. Speakers explored the values underpinning transparency efforts by technology companies, how to support under-resourced companies in publishing transparency reports, how to operationalize competing definitions of terrorism and violent extremism, and how to evaluate incident response mechanisms.

New GIFCT Members

The commitment and expertise of the tech industry were widely recognized as critical to the development of effective solutions to terrorism and violent extremism online.

We are therefore very pleased to welcome Twitch and Meta as new GIFCT Members, and are grateful to our member companies for their continued collaboration in our shared mission to prevent terrorists and violent extremists from exploiting digital platforms. We look forward to continuing to work closely with Twitch, Meta, our other member companies, and our wider community of stakeholders towards a world in which our collective creativity and capacity render terrorists and violent extremists ineffective online.

A big thanks to our cohost, UN CTED, and to our speakers, moderators, and honored guests for joining us for these rich and substantive discussions.