A few months ago, TikTok launched a global project to better understand young people's relationship with potentially dangerous challenges and hoaxes.
This content, and the behaviors that stem from it, is not limited to TikTok, but in recent times the dangers spreading on the platform have gained considerable relevance, at least in terms of media attention.
To gauge the scope of the problem, TikTok surveyed more than 10,000 teenagers, parents and teachers from Argentina, Australia, Brazil, Germany, Italy, Indonesia, Mexico, the United Kingdom, the United States and Vietnam. With these data, it commissioned the independent agency Praesidio Safeguarding to write a report of conclusions and recommendations (authored by Dr. Zoe Hilton).
That text, in turn, was reviewed and expanded by a panel of 12 youth safety experts from around the world. After that, TikTok teamed up with Dr. Richard Graham, a clinical child psychiatrist specializing in healthy adolescent development, and Dr. Gretchen Brion-Meisels, a behavioral scientist specializing in adolescent risk prevention.
The report
The result of all this is a report entitled Analysis of effective preventive educational responses to dangerous challenges online, which not only analyzes the risks this content poses to young people and how adults can help them face those risks, but also dissects the role of the media in spreading supposed challenges that are not real, multiplying their potential danger among users who may want to join what they perceive as an appealing trend.
After a detailed study of all these variables, the text closes with a series of recommendations for parents and guardians, teachers and actors in the public sphere (including the media) to help them detect, combat and report content that can have particularly harmful effects on the most vulnerable users, including adolescents.
Not all challenges are harmful
As TikTok stressed in the online session launching the report, most challenges are safe and fun, and some are even useful (the 2014 #IceBucketChallenge helped raise awareness of ALS, and the #BlindingLightsChallenge, a family unity challenge, did just that for families).
But not all of them are so positive or innocuous. There are darker ones, with murkier origins and purposes, such as the Blue Whale or Momo hoaxes, which take a particular toll on younger users who, according to Dr. Graham and Dr. Brion-Meisels, are at a stage of life usually associated with greater risk-taking.
How adolescents assess risk
In the study, adolescents were asked to describe the level of risk of an online challenge they had recently seen. Almost half (48%) believed the challenges they had viewed were safe, classifying them as fun or light-hearted.
A further 32% of those surveyed said the challenges involved some risk but were still safe, 14% described them as risky and dangerous, and 3% as very dangerous; only 0.3% of adolescents said they had participated in a challenge they classified as very dangerous.
What methods do teens use to understand the risks of challenges before participating in them?
For example, they watch videos of other people participating in them, read comments and talk with friends.
One of the key steps in keeping teens safe is providing guidance on how to assess potential risks. Nearly half (46%) of teens said they would like “good risk information more broadly” and information “about what it takes to go too far.”
Parental concerns about the impact of hoaxes
Undoubtedly the most dangerous hoaxes (whether disguised as challenges or not) are those related to suicide and self-harm, which seek to spread distressing ideas that are not true.
Hoaxes of this type, the study explains, usually share similar features, and false warnings have circulated before about children being incited to take part in games ending in self-harm.
Once in circulation, these hoaxes spread largely through warning messages that encourage others to alert as many users as possible.
Although sharing these warnings may seem harmless, the research reveals that 31% of adolescents exposed to these hoaxes have experienced a negative impact. Of these, 63% said the negative impact was on their mental health.
As for parents and guardians, the truth is that they do not feel confident talking to adolescents about these hoaxes. They worry that if they mention the name of an alleged challenge, their child may take an interest in potentially harmful behavior he or she did not previously know about.
More than half (56%) of parents said they would not mention a hoax related to self-harm unless a teen had brought it up first, and 37% of parents found it difficult to talk about hoaxes without sparking some interest in them.
Bolster Protection Efforts
Research has shown how warnings about self-harm hoaxes, even when shared with the best of intentions, can affect teen well-being. TikTok has recently removed these hoaxes and taken steps to limit their spread, and will now start removing alarmist warnings as well, since they can cause harm by treating the self-harm hoax as real.
TikTok, the company points out, has used the conclusions of Dr. Hilton's report to review its policies and processes and is stepping up its efforts to introduce improvements that strengthen the safety of the platform.
In the words of Alexandra Evans, TikTok's head of safety public policy in Europe, “Having strong policies is an important part of our work to protect the community, and it is essential that these policies are accompanied by strong detection and enforcement measures.”
Among those measures is technology that alerts the company's safety teams to sudden increases in violating content linked to specific hashtags.
For example, if a hashtag like #FoodChallenge (normally used to share food recipes) showed a spike in content that violated the platform's policies, the team would receive an alert.
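As a rough illustration only, and not a description of TikTok's actual systems, that kind of hashtag-level alerting can be thought of as a simple spike check on counts of policy-violating videos per hashtag. The minimal sketch below is hypothetical throughout: the class name, window size and threshold are invented for the example.

```python
from collections import deque

# Hypothetical parameters: how many recent hourly counts form the baseline,
# and how far above the baseline a count must be to trigger an alert.
WINDOW = 24
SPIKE_FACTOR = 3.0

class HashtagSpikeMonitor:
    """Toy spike detector over per-hashtag counts of violating videos."""

    def __init__(self):
        self.history = {}  # hashtag -> deque of recent hourly violation counts

    def record(self, hashtag: str, violations_this_hour: int) -> bool:
        """Store the latest count and return True if it looks like a spike."""
        counts = self.history.setdefault(hashtag, deque(maxlen=WINDOW))
        baseline = sum(counts) / len(counts) if counts else 0.0
        counts.append(violations_this_hour)
        # Only flag a spike once there is some baseline to compare against.
        return baseline > 0 and violations_this_hour >= SPIKE_FACTOR * baseline

monitor = HashtagSpikeMonitor()
for hour, count in enumerate([2, 3, 2, 4, 3, 18]):  # toy hourly data
    if monitor.record("#FoodChallenge", count):
        print(f"hour {hour}: alert safety team, unusual spike for #FoodChallenge")
```

In this toy run, the jump from a handful of violating videos per hour to 18 is what would trip the alert; a production system would of course use far more signals than a single moving average.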
New Resources to Support the TikTok Community
One of the report’s main findings is that teens, parents, and educators need more information about the challenges and hoaxes.
To that end, TikTok has worked with Dr. Graham, Dr. Brion-Meisels and Anne Collier (The Net Safety Collaborative) to develop improvements to its Safety Center, where caregivers will find advice for answering questions.
In addition, in collaboration with Graham and Brion-Meisels, the platform has improved the language of the warning labels shown to people who try to search for content related to harmful challenges or hoaxes.
A new notice will encourage members of the community to visit the Safety Center and, if people search for hoaxes related to suicide or self-harm, additional resources will be displayed in the search results.