
CTED Insight Briefing brings technology-driven counter-terrorism into focus

On 27 March 2023, CTED hosted an Insight Briefing on blind spots in technology-driven counter-terrorism decision-making processes and on methods to mitigate them. CTED/Vijai Singh-Persaud


The use of predictive technologies to improve counter-terrorism initiatives, and more specifically border security, was the focus of CTED’s latest Insight Briefing, which was held on 27 March 2023.

One of the main takeaways of the Briefing was that while predictive and probabilistic algorithms, human and signals intelligence, big data analytics, and facial recognition capabilities offer opportunities for Member States’ efforts to address the scourge of terrorism, they also present challenges, particularly for the accuracy of risk assessments and the protection of human rights.

Noting the Security Council’s guidance in its resolution 2396 (2017) on the benefits of biometrics in counter-terrorism, as well as on the need to improve standards for the use and collection of biometric data in counter-terrorism, the Briefing outlined the limitations of technology-driven counter-terrorism and offered recommendations for mitigating them.

In his opening remarks, David Scharia, Director and Head of the Technical Expertise and Research Branch of CTED, noted that the Briefing was aimed at assisting Member States in identifying methods to improve technology-assisted decision-making processes in the context of counter-terrorism.

The Briefing featured a presentation by Professor Shiri Krebs, Professor of Law at Deakin University, Australia, and a member of CTED’s Global Research Network. The presentation, entitled “Fact and Fiction in Technology-Driven Counter-Terrorism”, detailed how counter-terrorism efforts in the context of airport and border security had increasingly evolved towards preventative counter-terrorism. The benefit of predictive and probabilistic technologies lies in their ability to gather vast amounts of relevant information, process it immediately, and identify connections and inconsistencies.

Professor Krebs noted, however, that attempts to prevent terrorist attacks by identifying suspicious individuals, including on the basis of data collected in terrorism watch lists and databases and through law enforcement cooperation, could produce false predictions about individuals and incorrect assessments of the risk they pose. This could, in turn, negatively affect human rights, including the principles of equality and privacy. During her presentation, Professor Krebs also explained how technological limitations, limitations surrounding humans’ use of technology, and cognitive biases could cause decision-making errors in counter-terrorism risk assessments.
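The scale of the false-prediction problem can be illustrated with a simple base-rate calculation. The sketch below is not drawn from the Briefing; it uses purely hypothetical figures to show why, when the population of genuine concern is very small, even a highly accurate screening tool will flag mostly people who pose no threat.

# Illustrative sketch only (hypothetical figures, not from the Briefing):
# screening for a very rare threat yields many false positives even when
# the predictive tool itself is highly accurate.

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    # Bayes' rule: probability that a flagged traveller is a true match.
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Assumed figures: 1 traveller of genuine concern per 100,000 screened,
# a tool that detects 99% of true matches and wrongly flags 1% of the rest.
ppv = positive_predictive_value(prevalence=1e-5,
                                sensitivity=0.99,
                                false_positive_rate=0.01)
print(f"Flagged travellers who are true matches: {ppv:.2%}")
# Prints roughly 0.10%: about 999 of every 1,000 flags fall on people who
# pose no threat, which is why safeguards and human review remain essential.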

Concluding with suggestions for improving predictive counter-terrorism, Professor Krebs cited the need to develop transparent data practices and decision-assisting technologies, to strengthen and clarify evidentiary standards, and to provide capacity-building training to help de-bias national and international decision makers.