Introduction & How to Use This Document
This document is a guide for civil society groups, academic researchers, governments, and other tech companies seeking to learn more about the resources and information that GIFCT member companies make available on their efforts to counter terrorist and violent extremist activity, and the tools they have developed to combat online radicalization.
In each section we have linked directly to resources on each topic developed by GIFCT member companies, as well as resources from GIFCT and its partners at the Global Network on Extremism and Technology (GNET) and Tech Against Terrorism.
Community Standards
Most platforms require users to agree to a set of terms of service before accessing the platform’s tools. While terms of service give the legal parameters for usage, content standards explain in less legalistic language what is and is not allowed to be shared on a given tech platform. These guidelines are put in place to ensure users can use the platform freely and safely, while understanding where they might cross the line and have content removed or their engagement on the platform restricted. These guidelines are generally global. Most companies design, review, and update these standards based on feedback from a range of stakeholders, which might include their users, government bodies, and global experts in fields such as technology, public safety, and human rights. GIFCT members enforce their own respective policies and conduct their own practices in response to violations of their terms of service or standards, such as content removal and account disabling.
Transparency
In addition to explicit platform policies that prohibit terrorist and violent extremist content or activities, every GIFCT member is required to publish, at a minimum, an annual transparency report that reviews how its standards are upheld and enforced. Though the exact format and presentation vary from company to company, these reports tend to provide baseline data on government data request processing and wider removal compliance. As a company gains the resources and ability to build out more nuanced metrics, its transparency reports might also include data on the appearance and removal of specific types of violating content and the amount of content on which the platform took action. The most advanced examples of transparency reporting to date sometimes include appeal rates, which indicate false-positive removal rates, and even the prevalence of violating content on the platform over time.
Safety Hubs
Activists, academics, journalists, and practitioners in the counterterrorism and counter-extremism space often have to do more than the average user to mitigate risks to their personal safety online. By using online platforms to research or challenge hate-based extremism and terrorism, an individual can put themselves, or the sensitive communities they engage with online, at risk. Safety concerns also arise for many activists and NGOs whose public-facing interactions with different communities expose them to a range of online risks. Platforms’ safety guidelines and resources are therefore increasingly important to ensure that users know how to flag abuse, manage privacy settings, and report issues such as potentially hacked accounts or the nefarious activities of dangerous organizations.
Reporting Mechanisms
Related to broader safety concerns, another GIFCT membership requirement states that all members must have a functional way to “receive and act on reports of illegal activity or activity violating terms of service.” While members approach this in a variety of ways, all member companies provide public resources and tools to assist users in reporting illegal and prohibited activity and content.
At the very least, a publicly available email address or direct contact portal is available through which users may contact the company and report content or activity in violation of the platform’s guidelines. Mailchimp, MEGA, and WhatsApp all provide basic outreach portals of this kind. For companies like MEGA and WhatsApp, which feature end-to-end encryption, this is the most efficient way to report abuse because the company does not otherwise have access to users’ content and activity. JustPaste.It users can directly email the platform at [email protected] – instructions for properly submitting information about violating content can be found here.
Counterspeech and Counter-Narrative Facilitation
While safety tools allow users to flag harm and on-platform violations that may lead to the removal of content, we also know that removal alone will never address the root causes of radicalization and hate-based extremism. Larger social media companies have realized that their platforms also house crucial tools for activists and practitioners to develop content and online activities for prevention purposes. Some GIFCT members have developed tools that can amplify community-level voices that challenge hate speech and extremism. Supporting counterspeech and counter-narrative efforts emerges from a recognition that deplatforming and removing content only addresses the symptoms of radicalization, rather than its causes.
Digital Literacy
While counterspeech and counter-narrative campaigns and initiatives look to disrupt a cycle or pathway of hate-based radicalization, many practitioners and researchers have highlighted that preventing violent extremism needs to start with better, more comprehensive digital literacy for online audiences, young and old. Digital literacy is a broad term that refers to a range of skills and information that allow individuals and organizations to operate safely and effectively in digital environments by questioning online information and having the tools to discern opinion from fact. While some of the material shared here is targeted at younger people, it is equally useful to those unfamiliar with operating on digital platforms and to those who wish to engage younger audiences (activists, NGOs, teachers), as it provides the basics in clear, straightforward language that can be easily communicated to others.
Advertising and Marketing
Some GIFCT member companies offer advertising and marketing tools for their users, depending on how they function and monetize as a platform. While these tools are often developed for more traditional commerce and marketplace use cases, they can also be leveraged by activists, NGOs, and practitioners to better deliver their messaging to target audiences and further counterspeech goals. Some companies use these advertising and marketing tools to ensure that a range of organizations’ and activists’ campaigns and voices reach the relevant target audiences and can be effectively measured.
Appendix A: GIFCT Membership Criteria and Benefits
Membership Criteria
GIFCT member companies are listed on our website here. In order to join GIFCT, companies must meet the following requirements:
- Content standards that explicitly prohibit the promotion of terrorism in their terms of service, community guidelines, or other publicly available content policies
- The ability to receive and act on reports of illegal activity or activity violating terms of service
- A desire to explore new technical solutions to content and conduct challenges
- Regular, public data transparency
- A public commitment to respecting human rights, particularly free expression and privacy, when implementing content removal policies
- Support for expanding the capacity of civil society organizations to challenge violent extremism
If a company looking to join GIFCT does not yet meet certain requirements, GIFCT offers that company mentorship through our partnership with Tech Against Terrorism.
Membership Benefits
- Potential access to GIFCT’s hash-sharing database and URL sharing program
- Participation in crisis response communications around international terrorist and violent extremist events with online implications
- Briefings from scholars associated with the Global Network on Extremism and Technology (GNET), the academic research arm of GIFCT
- Briefings on technological approaches and solutions
- Priority participation in topical workshops, e-learnings and webinars with global experts
Contact: For any questions, contact us at [email protected].
Appendix B: GIFCT and Partner Resources Index
Partner | Webpage Title | Link
GIFCT | Membership Pillars | Click here to view |
GIFCT | Transparency | Click here to view |
GIFCT | Working Groups | Click here to view |
GIFCT | News Page and Newsletter Subscription | Click here to view |
GIFCT | Campaign Toolkit | Click here to view |
GIFCT | Digital Security Checklist | Click here to view |
GIFCT | Email Updates | Click here to view |
GIFCT | The Definitions and Principles Framework Project | Click here to view |
TAT | Mentorship | Click here to view |
TAT | Workshops and Events | Click here to view |
TAT | Knowledge Sharing Platform (KSP) | Click here to view |
GNET | Homepage: Insights and Reports | Click here to view |
Member Resources Index
Member | Webpage Title | Link
Airbnb | Community Policies | Click here to view |
Airbnb | Transparency | Click here to view |
Airbnb | Safety and accessibility | Click here to view |
Airbnb | Neighborhood Support | Click here to view |
Airbnb | Trust and Safety | Click here to view |
Airbnb | Host Resource Center | Click here to view |
Airbnb | How do I report a message or block someone on Airbnb? | Click here to view |
Airbnb | How do I report discrimination to Airbnb? | Click here to view |
Amazon | Community Guidelines | Click here to view |
Amazon | Security and Privacy – Law Enforcement Information Requests | Click here to view |
Amazon | Security and Privacy | Click here to view |
BitChute | Community Guidelines | Click here to view
BitChute | Guidelines Enforcement Process | Click here to view
BitChute | Transparency Reporting | Click here to view
Clubhouse | Trust and Safety page | Click here to view |
Clubhouse | Knowledge Center | Click here to view |
Clubhouse | Community Guidelines | Click here to view |
Discord | Community Guidelines | Click here to view |
Discord | Transparency Report | Click here to view |
Discord | Safety Center | Click here to view |
Discord | Trust and Safety – How to Properly Report Issues to Trust and Safety | Click here to view |
Dropbox | Acceptable Use Policy | Click here to view |
Dropbox | Transparency Overview | Click here to view |
Dropbox | How to report inappropriate content on Dropbox | Click here to view
Facebook | Community Standards | Click here to view
Facebook | Transparency Reports | Click here to view
Facebook | Digital Literacy Library | Click here to view
Facebook | Community Standards – Dangerous Individuals and Organizations | Click here to view
Facebook | Combating Hate and Extremism (FB News, 2019) | Click here to view
Facebook | What Are We Doing to Stay Ahead of Terrorism | Click here to view
Facebook | Facebook Counterspeech | Click here to view
Facebook | Social Impact | Click here to view
Facebook | Safety Center | Click here to view
Facebook | Safety Center – Law Enforcement | Click here to view
Facebook | Safety Center – Parents Portal | Click here to view
Facebook | Safety Center – Empowering Youth | Click here to view
Facebook | Safety Advisory Board | Click here to view
Facebook | Policies and Reporting – How to Report Things | Click here to view
GIPHY | Community Guidelines | Click here to view |
GIPHY | Submit a request or report | Click here to view |
GIPHY | 2021 Transparency Report | Click here to view |
Google | Abuse and Content Policies | Click here to view
Google | Transparency Report | Click here to view
Google | Report Content for Legal Reasons | Click here to view
Instagram | Community Guidelines | Click here to view
Instagram | Transparency Reports | Click here to view
Instagram | Abuse and Spam – Report | Click here to view
Instagram | Community Resources | Click here to view
JustPaste.it | Terms of Service | Click here to view |
JustPaste.it | Transparency Report | Click here to view |
JustPaste.it | Abuse Reporting | Click here to view |
LinkedIn | Professional Community Policies | Click here to view
LinkedIn | Transparency | Click here to view
LinkedIn | Report Inappropriate Content, Messages, or Safety Concerns | Click here to view
Mailchimp | Acceptable Use Policy | Click here to view |
Mailchimp | Transparency Reports | Click here to view |
Mailchimp | Abuse Desk | Click here to view |
MEGA | Terms of Service | Click here to view |
MEGA | Transparency Report 2022 | Click here to view |
MEGA | Contact Us – Report Objectionable Material | Click here to view |
MEGA | Takedown Guidance Policy | Click here to view |
Microsoft | Report Hate Speech Content Posted to a Microsoft Hosted Consumer Service | Click here to view |
Microsoft | Corporate Social Responsibility – Reports Hub | Click here to view |
Microsoft | Digital Literacy | Click here to view |
Microsoft | Digital Safety Content Report | Click here to view |
Microsoft | Report Terrorist Content Posted to a Microsoft Consumer Service | Click here to view |
Microsoft | Register with Microsoft Nonprofits | Click here to view |
Microsoft | Online Safety Tips | Click here to view |
Microsoft | Online Safety – Resources and Research | Click here to view |
Microsoft | Security | Click here to view |
Microsoft | Report a Concern to Bing | Click here to view |
Microsoft | Expanding our commitments to countering violent extremism online | Click here to view |
Microsoft | Online Safety – Promoting Digital Civility | Click here to view |
Microsoft | Request to Reinstate Disabled Content | Click here to view |
Microsoft | Microsoft Services Agreement with Code of Conduct | Click here to view |
Microsoft | Xbox Community Standards | Click here to view |
Microsoft | Human rights statement | Click here to view |
Microsoft | Microsoft’s approach to terrorist content online | Click here to view |
Microsoft | Bing’s Webmaster guidelines | Click here to view |
Niantic | Player Guidelines | Click here to view |
Niantic | Product Support | Click here to view |
Niantic | Security | Click here to view |
Niantic | Transparency Report | Click here to view |
Pinterest | Community Guidelines | Click here to view
Pinterest | Transparency Report | Click here to view
Pinterest | Help Center – Safety – Report Something on Pinterest | Click here to view
Pinterest | Help Center – Safety – Get More Help – Report a Policy Violation | Click here to view
Tumblr | Community Guidelines | Click here to view |
Tumblr | Transparency Report | Click here to view |
Tumblr | Reporting Content | Click here to view |
Tumblr | Report Abuse | Click here to view |
Tumblr | Report Suspected Terrorism Content | Click here to view |
Twitch | Safety Center | Click here to view |
Twitch | Community Guidelines | Click here to view |
Twitch | Transparency Report | Click here to view |
Twitch | Filing a Report | Click here to view |
Twitch | Human Rights Impact Assessment | Click here to view |
Twitch | Guide for Parents and Educators | Click here to view |
Twitch | Connect Safely’s Parent’s Guide to Twitch | Click here to view |
Twitter | The Twitter Rules | Click here to view
Twitter | Campaigning on Twitter – The Handbook for NGOs, Politics and Public Service | Click here to view
Twitter | Transparency | Click here to view
Twitter | Safety and Security | Click here to view
Twitter | Brand Safety | Click here to view
Twitter | Advertising – Targeting | Click here to view
Twitter | Advertising – Campaign Types | Click here to view
Twitter | Analytics | Click here to view
Twitter | Business Resources and Guides | Click here to view
Twitter | Corporate Philanthropy | Click here to view
Twitter | Hateful Conduct Policy | Click here to view
Twitter | @TwitterSafety (Twitter Account) | Click here to view
Twitter | Help – Safety and Security – Sensitive Content | Click here to view
Twitter | Business – Twitter Video Resources | Click here to view
Twitter | Rules and Policies – Report Violations | Click here to view
Twitter | Teaching and Learning with Twitter Media and Information Literacy | Click here to view
WhatsApp | How to Use WhatsApp Responsibly | Click here to view
WhatsApp | Safety Tips and Security | Click here to view
WhatsApp | Unauthorized use of automated or bulk messaging on WhatsApp | Click here to view
WhatsApp | Blocking and Reporting Contacts | Click here to view
WordPress.com | Report Abuse | Click here to view |
WordPress.com | User Guidelines | Click here to view |
WordPress.com | Transparency Report | Click here to view |
WordPress.com | Report a Site | Click here to view |
WordPress.com | Terrorist Activity Policy | Click here to view |
YouTube | Community Guidelines | Click here to view |
YouTube | Transparency – Community Guidelines Enforcement | Click here to view |
YouTube | Safety Resources | Click here to view |
YouTube | Safety Resources – Suicide and Self-Injury Policy | Click here to view |
YouTube | Safety Resources – Parent Resources | Click here to view |
YouTube | Safety Resources – Teen Resources | Click here to view |
YouTube | Safety Resources – Educator Resources | Click here to view |
YouTube | Policies – Violent extremist or criminal organizations policy | Click here to view |
YouTube | Hate Speech Policy | Click here to view |
YouTube | Advertising | Click here to view |
YouTube | Creators Resources | Click here to view |
YouTube | Google for Nonprofits – YouTube Nonprofit Program | Click here to view |
YouTube | Social Impact | Click here to view |
YouTube | Help Center – Report Inappropriate Content | Click here to view |
YouTube | Help Center – Other Reporting Options | Click here to view |
YouTube | Help Center – Report a YouTube search prediction | Click here to view |
YouTube | Help Center – Trusted Flagger Program | Click here to view |
YouTube | How YouTube prevents radicalization | Click here to view |
YouTube/Google | Request Google Nonprofit Account | Click here to view |
Zoom | Terms of Service | Click here to view |
Zoom | Trust Center | Click here to view |
Zoom | Security Overview | Click here to view |
Zoom | Privacy Practices | Click here to view |
Zoom | Compliance | Click here to view |
Zoom | Trust and Safety Center | Click here to view |
Zoom | Privacy Statement | Click here to view |
Zoom | Transparency Report | Click here to view |
Zoom | Community Standards Enforcement Transparency Report | Click here to view |
Zoom | Acceptable Use Guidelines | Click here to view |
Zoom | Expression, Safety and Process at Zoom | Click here to view |