A new project to detect online antisemitism using artificial intelligence has been launched in Berlin.
On Sunday, the Alfred Landecker Foundation commenced “Decoding Antisemitism”, a three-year undertaking in association with research institutions including King’s College London and the Centre for Research on Antisemitism at the Technical University of Berlin.
A worldwide team of analysts, computational linguists and historians will aim to develop an AI-driven approach to identifying antisemitism online.
The project hopes to harness the power of computers to assess quantities of data far beyond what human analysts can review, whilst also recognising antisemitic sentiment expressed in implicit ways.
The foundation also plans to develop an open source tool that can be used to detect hateful content online.
“Antisemitism and hatred directed against minorities are putting the future of our open society in jeopardy,” said Alfred Landecker Foundation CEO Dr Andreas Eberhardt. “It’s essential that we use innovative approaches – such as using AI – to tackle these issues head on.”
Dr Matthias J Becker, the project lead, noted the connection between online hate speech and hate crimes in wider society.
“In order to prevent more and more users from becoming radicalised on the web, it is important to identify the real dimensions of antisemitism – also taking into account the implicit forms that might become more explicit over time,” he said.
The project will initially focus on Germany, the United Kingdom and France, but aims to later expand to cover other countries and languages.
Antisemitic hate speech and conspiracy theories have been exacerbated by the coronavirus pandemic.
In July, a report by the Commission for Countering Extremism warned of five “dangerous” conspiracy theories that have appeared online suggesting Jews are the malevolent force behind the pandemic.
A recent report by the Community Security Trust (CST) also referred to online conspiracy theorists blaming Jews for spreading coronavirus using 5G mobile phone towers.
Similarly, earlier this month the European Union’s counter-terrorism coordinator expressed concern about the potential rise of “new forms of terrorism, rooted in conspiracy theories and technophobia”, noting a particular rise in violence towards Jews.
In Germany, 29 police officers in North Rhine-Westphalia were suspended last week for sharing neo-Nazi images and using far-right chatrooms, a day after German Chancellor Angela Merkel admitted that “many Jews don’t feel safe and respected in our country”.
Dr Daniel Allington, Senior Lecturer in Social and Cultural Artificial Intelligence at King's College London, criticised tech companies for “failing to stem the tide of online hate”.
“The task is difficult because hatred is often expressed in subtle ways and constantly changes form. But machine learning can serve as a force multiplier, extending the ability of human moderators to identify content that may need to be removed.”
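By way of illustration only, the sketch below shows how such a human-in-the-loop approach can work in principle: a simple text classifier scores each comment and routes likely hate speech to a human moderator. This is a hypothetical toy example in Python using scikit-learn, not the Decoding Antisemitism project’s actual system; the training examples, model choice and threshold are invented for illustration.

```python
# Hypothetical sketch of machine learning as a "force multiplier":
# a toy classifier flags comments for human review. NOT the project's
# real system; real deployments need large annotated corpora and far
# richer models to catch implicit or coded language.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set (1 = hateful/coded trope, 0 = benign).
posts = [
    "they control the banks and the media",
    "lovely weather in Berlin today",
    "you know who is really behind the virus",
    "looking forward to the football this weekend",
]
labels = [1, 0, 1, 0]

# Bag-of-words features feeding a simple linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(posts, labels)

def flag_for_review(comment, threshold=0.5):
    """Return True if the comment should be routed to a human moderator."""
    prob_hateful = model.predict_proba([comment])[0][1]
    return prob_hateful >= threshold

print(flag_for_review("who really controls the media?"))
```

In this pattern the model never removes content on its own: it only prioritises what human moderators look at first, which is the “force multiplier” role Dr Allington describes.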