Lazily scrolling through TikTok late one Sunday evening, I stumbled across a video featuring former kickboxer Andrew Tate. Unaware of Tate and his reputation, I watched it.
In the clip, 35-year-old Tate (born in the US but brought up here) argued that men should only date 18- to 19-year-old girls because they are easier to “imprint” on, and that 26- to 28-year-old girls are less attractive because they have been “f***ed and dumped more times”.
In other now-viral social media posts, Tate — a self-described “misogynist” — has also claimed that women in relationships should not be allowed to drive or leave the house alone, and that “if you put yourself in a position to be raped, you must bear some responsibility”.
Tate, who is currently under police investigation in Romania for rape and human trafficking, also stated that “40 per cent of the reason” he moved to Romania is that police are allegedly less likely to pursue rape allegations in the Eastern European country.
Given this track record of appalling statements, I tried to ignore Tate as much as possible.
But his videos continued to appear on my feed with alarming regularity. In a few short weeks, videos of Tate or people talking about Tate dominated my social media.
TikTok funnelled his addictive, toxic videos onto my screen and there was little I could do to stop it. These were not videos I was choosing to watch but videos being automatically suggested to me by social media algorithms.
As a result of these algorithms, Tate has become one of the most talked-about social media personalities of 2022. According to Bloomberg, TikTok videos that include the tag #AndrewTate have been viewed more than 13.8 billion times and, according to The Tab, in August Andrew Tate was Googled more times than Donald Trump and Kim Kardashian.
The issue here is far wider than just Tate. Social media algorithms have the power to shape how millions see events — and to give otherwise obscure figures a major platform.
In the last few weeks, social media companies have begun to take action against Tate. He is now banned from Facebook, Instagram, Twitter and TikTok for posting content that “attacks, threatens, incites violence against” people.
But Tate is just one man, and the issue of these algorithms remains. Social media companies appear unwilling to admit the problem, let alone deal with it. Jews above all know the dangers inherent in this.
One of the main reasons hate-filled antisemitic messages do so well is that they are actively promoted by social media companies. The algorithms are programmed to register engagement and boost posts accordingly. It is the ultimate vicious circle: because antisemitic comments lead to engagement, those who post them see their posts doing well and so post more — and those who criticise the comments end up fuelling the spread.
Research shows that algorithms draw users towards steadily more extreme content, a phenomenon termed “algorithmic hate” by criminologist Matthew Williams.
Consequently, almost every young Jewish boy with a phone has seen videos of Tate. Some find him repulsive, others ridiculous, but some teenage boys have expressed an alarming interest in Tate and his ideas.
“He has a point”, I’ve heard them say. “He doesn’t mean it like that”, others will declare. They will rush to defend his sexist hogwash as “just a joke” and praise his “funny” quips.
His videos are cutting through to a very malleable demographic. According to some, Tate appeals to young boys by talking about fitness and offering easy money-making tips. By doing so, he is able to reel in a generation of vulnerable, disenfranchised young men.
As a community we are not immune to issues of sexism, misogyny or bullying. It is vital that we stay vigilant against these harmful flash-in-the-pan online personalities.
Jewish parents must be made aware of the content that is being so readily advertised to their young sons.