A study conducted by the Center for Countering Digital Hate (CCDH) has suggested that TikTok is “pushing harmful content into teenagers’ feeds”, warning that it could encourage eating disorders, self-harm and suicide. The research, carried out by the online safety group, found that certain accounts were repeatedly served content about eating disorders and other harmful topics within minutes of joining the platform.
To carry out the research, the group created two accounts posing as 13-year-olds in each of four countries – the US, UK, Australia and Canada.
One account in each country was given a female name and the other was given a similar name but with a reference to losing weight included in the username.
The content served to both accounts in their first 30 minutes on TikTok was then compared.
Imran Ahmed, chief executive of the CCDH, accused TikTok of “poisoning the minds” of younger users.
He said: “It promotes to children hatred of their own bodies and extreme suggestions of self-harm and disordered, potentially deadly, attitudes to food.
“Parents will be shocked to learn the truth and will be furious that lawmakers are failing to protect young people from big tech billionaires, their unaccountable social media apps and increasingly aggressive algorithms.”
When setting up the accounts, the researchers interacted with any harmful content they encountered by liking videos relating to self-harm, eating disorders or suicide.
This indicated to TikTok’s algorithm that these were subjects the user was interested in.
In a statement, a spokesperson for TikTok said the researchers’ activity and resulting experiences don’t “reflect genuine behavior or viewing experiences of real people”.
The CCDH also claimed that the accounts used in the study maintained a preference for videos about body image, mental health, and eating disorders.
The online safety group’s report warns that the sheer speed with which TikTok recommends content to new users is harmful.
During its test, the CCDH said one of its accounts was served content referencing suicide within three minutes of joining TikTok and eating disorder content was served to one account within eight minutes.
It said that, on average, its accounts were served videos about mental health and body image every 39 seconds.
And the research indicated that the more vulnerable accounts – which included the references to body image in the username – were served three times more harmful content and 12 times more self-harm and suicide-related content.
The CCDH said the study had found an eating disorder community on TikTok which uses both coded and open hashtags to share material on the site, with more than 13 billion views of their videos.
The video-sharing platform includes a For You page, which uses an algorithm to recommend content to users as they interact with the app and it gathers more information about a user’s interests and preferences.