TikTok Under Investigation Over Child Sexual Abuse Material
According to reports, the US Department of Homeland Security has opened an investigation into TikTok's handling of child sexual abuse material and the moderation procedures in place. The agency is examining how TikTok's "Only Me" feature was allegedly exploited to share illegal content, according to the Financial Times, which says it verified the allegations in collaboration with child safety organizations and law enforcement officials.
The Only Me feature lets users save TikTok videos without publishing them; when a video's status is set to Only Me, only the account's owner can view it. Bad actors reportedly shared the credentials of accounts used to store videos depicting child sexual abuse material (CSAM) on TikTok. Because these videos were never made public, they were never caught by TikTok's moderation system.
Seara Adair, a child safety activist, was quoted as saying:
“TikTok talks constantly about the success of their artificial intelligence, but a naked child is slipping through it.”
In March of this year, the federal agency, citing data security concerns, banned TikTok from all of its networks and devices, including phones and laptops managed by the department's information technology systems.
This isn't the first time TikTok has been the subject of a major investigation. Between 2019 and 2021, the number of Department of Homeland Security investigations into the dissemination of child exploitation content on TikTok reportedly increased sevenfold. Despite the platform's grandiose claims of strict policy enforcement and severe action against abusive content featuring children, criminal actors appear to be thriving on it.
The dangers are growing, according to media regulator Ofcom, which reports that 16 percent of children aged three to four watch TikTok videos. According to the National Society for the Prevention of Cruelty to Children (NSPCC) in the United Kingdom, online grooming offenses hit a new high in 2021, with children being especially vulnerable. Although predators prefer Instagram and Snapchat, accounts of child grooming on TikTok have surfaced online on several occasions in recent years.
Nor is this the first time TikTok has drawn attention for the wrong reasons. Last month, two former TikTok content moderators sued the firm, alleging that it failed to provide necessary support while they dealt with disturbing content showing "child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder."
Predators targeted children as young as nine years old with lewd comments and proposals, according to a BBC investigation from 2019. That same year, the United Kingdom's information commissioner, Elizabeth Denham, opened an investigation into TikTok over the platform's handling of personal data belonging to underage users. And, given its enormous popularity among young users, uninstalling it isn't quite as simple as deleting Facebook.
TikTok has recently implemented safeguards to protect its young user base. Strangers can no longer contact TikTok accounts belonging to users under the age of 16, and those accounts now default to private, TikTok announced last year. The short-video platform has also tightened download restrictions on videos shared by users under the age of 18.
Last year, TikTok also added tools to assist sexual assault survivors, developed with specialists from the Rape, Abuse & Incest National Network (RAINN), including easy access to the National Sexual Assault Hotline.