I have to admit, I’m a 55-year-old stalker. Not a “weird” kind of stalker, but someone who regularly scrolls through Facebook and Instagram spying on people’s lives, passing judgment on why people over-share and shopping for bohemian jewellery that I don’t need but can’t resist.
I convince myself it’s ok because I would never post anything about my life and can’t understand why anyone would.
As a parent, I observe my children surgically attached to their phones, over-sharing our family holidays, their achievements and social interactions.
I choose to turn a blind eye, convincing myself that it’s just the way of the world today and there is nothing we can do to stop it. I don’t understand Snapchat, TikTok or the Metaverse.
However, as wellbeing manager of PaJeS, I recently hosted an online workshop for parents of pupils in Jewish schools as part of Children’s Mental Health Week and Safer Internet Day.
We were fortunate to hear from Imran Ahmed, chief executive of the Centre for Countering Digital Hate (CCDH), which works to force social media companies to remove hateful or dangerous content by holding them directly accountable for amplifying and profiting from it.
CCDH says that “social media platforms have become safe spaces for abuse and harmful content, making them potentially hostile environments for normal users.” He cited the impact that powerful, misogynistic influencers such as Andrew Tate are having on children, especially on young boys looking for celebrity role models.
Imran told us about the research CCDH had undertaken on TikTok, where the average viewer spends 80 minutes a day on the app. Naively, I thought TikTok was a platform where children sang and danced to pre-recorded tracks. After hearing from Imran, it was clear there is a generational gap in usage and understanding.
CCDH researchers set up new accounts in the United States, United Kingdom, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly (for less than three seconds) on videos about body image and mental health and “liked” them.
What they found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders. Every 39 seconds, TikTok recommended videos about body image.
Information from CCDH revealed every parent’s worst nightmare. The findings showed that some children’s social media feeds are bombarded with dangerous and harmful content, which will no doubt influence their state of mind, their social interactions and their mental health.
CCDH, in its report Deadly by Design, published last December, says: “TikTok operates through a recommendation algorithm that constructs a personalised endless-scroll ‘For You’ feed, ostensibly based on the likes, follows, watch-time, and interests of a user…
“TikTok identifies the user’s vulnerability and capitalises on it.”
CCDH has developed a parent guide to understanding the platforms that shape our children’s minds. It discusses the importance of talking openly about social media and showing interest in what our children’s feeds are showing them.
It’s time we stopped judging each other and turning a blind eye to what our children are looking at on their phones.
If we spent as much time in conversation with our children as they do on their devices, relationships would be healthier and parenting might be less challenging.
Let’s learn about social media from our children and find ways to share interests and to engage with them while ensuring they are kept safe online as well as offline.
A link to the CCDH parent guide and to the interview with Imran Ahmed can be found on the PaJeS website: pajes.org.uk/parent