TikTok was fined £12.7 million by the UK’s data watchdog for failing to safeguard children’s privacy.
The regulator estimated that as many as 1.4 million UK children under the age of 13 were using the platform in 2020.
According to an investigation by the Information Commissioner’s Office (ICO), the video-sharing platform used the data of children of this age without parental permission.
TikTok stated that it had “heavily invested” in preventing under-13s from accessing the platform.
According to the ICO, despite TikTok’s requirement that users be at least 13 years old to create an account, many younger children were still able to use the platform.
It stated that children’s data could have been used to track and profile them, possibly exposing them to harmful or inappropriate material.
“There are laws in place to ensure our children’s safety in the digital world as they are in the physical world,” said information commissioner John Edwards.
“TikTok did not follow these rules.”
“As a result, an estimated one million under-13s were improperly granted platform access, with TikTok collecting and using their personal data.”
“TikTok should have been smarter. It could have performed better. Our £12.7m fine shows the serious consequences of their failures.”
He later said that TikTok had “taken no steps” to obtain parental consent.
“When you sign up, you can be targeted for advertising, profiled, and your data is fed into an algorithm that feeds content,” he explained.
“If you’ve been viewing content that isn’t appropriate for your age, it can become increasingly extreme.”
“It can be quite dangerous for people who aren’t old enough to fully understand the implications and make appropriate decisions.”