TikTok is facing a billion-dollar lawsuit in the U.K. over its data collection practices for children under the age of 13. The popular video-sharing app has been sued by a former U.K. children’s commissioner on behalf of an unnamed 12-year-old girl who was concerned about how her data was being used.
The lawsuit claims that TikTok does not clearly explain to underage users or their parents how their data is collected. It has since grown into a class-action suit with millions of child claimants, seeking billions of pounds in damages.
This is not the first time that privacy protections for minors have come under scrutiny. Disney, Viacom, Kiloo and others have faced lawsuits over how they collect children’s data for marketing purposes. These cases often end in a monetary fine for the corporation. But for a company such as TikTok, with a net worth of $50 billion, financial penalties are temporary and do not produce lasting change in how data privacy is managed. Larger shifts are needed to regulate the data collection of minors beyond a pop-up asking the user’s age at registration.
TikTok has a serious issue with transparency when gathering information from its younger users. It acquires the child’s location and biometric data, videos, phone numbers and more without warning. Both parents and children are usually unaware that this data is being collected.
TikTok and its parent company, ByteDance, have reaped billions in advertising revenue. Users’ data, especially that of minors, should not be disingenuously obtained and exploited for financial gain. People have a right to know, and to decide, what data is being captured and how it is used.
Companies frequently address this with simple age verification, a measure that relies on the assumption that the user is acting in good faith. Realistically, any savvy middle schooler can set their birth year to claim they are older than 13 or click “yes” when asked if they are old enough to create an account.
Age verifications are a cop-out that lets companies do the bare minimum to comply with regulations. Even with those “verifications” in place, 44% of eight- to 12-year-olds in the U.K. still use TikTok. In the U.S., an estimated third of TikTok users are under 14.
Addressing the data collection of users under 13 is challenging because it is so easy for them to lie on their profiles. Instead of leaving children to find workarounds that let them use TikTok freely, social media companies should make accurate age reporting worthwhile. Rather than building platforms suitable only for teenagers and adults, TikTok should make its platform safe for all ages and allow younger children to register. When a user under 13 creates an account, TikTok should automatically register it as a child’s profile and apply filters that block explicit and inappropriate content.
To address data capture, social media platforms should be required to ask users for permission before collecting their data. One way of implementing this is through the App Store.
Apple recently released iOS 14.5, which includes App Tracking Transparency (ATT). ATT requires apps to request permission before tracking and sharing a user’s data and activity, giving people the ability to opt in or opt out. The pop-ups requesting tracking permission also include a statement explaining why the company wants to track the user’s information.
ATT is a major step forward in consumer data privacy, and it has the potential to be a lasting change that gives people more control over their information. Every operating system and platform should adopt something like it. Paired with child-tailored filters and content blockers, this would give parents peace of mind and lead to more transparent data-gathering practices. It is about time the punishment for conglomerates went beyond meager fines and ventured into lasting policies that force these platforms to change how they handle consumer data.