The U.K.’s data protection watchdog has announced an investigation into how TikTok manages and uses the personal data of teenagers to tailor the content recommended to them on the platform, amid concern that the algorithm could surface harmful content to minors as a result. Reddit and Imgur are also part of the investigation into the use of children’s data.
“What I am concerned about is whether they are sufficiently robust to prevent children being exposed to harm, either from addictive practices on the device or the platform, or from content that they see, or from other unhealthy practices,” Information Commissioner John Edwards said.
ByteDance, the firm that operates TikTok, defended its practices in a statement, saying its systems “operate under strict and comprehensive measures that protect the privacy and safety of teens.”
The announcement comes nearly two years after the same watchdog levied a £12.7 million fine against TikTok for failing to remove children below its stated minimum age of 13 from the platform, estimating that as many as 1.4 million users were under that age.
Because those children gained access to the platform, Edwards said, TikTok collected their data, which its recommendation algorithm could then use to deliver potentially harmful content. The watchdog also said TikTok had failed to ensure that the data of its British users was processed lawfully.