EDITORIAL: TikTok, Apps and the Inherent Value of Children’s Privacy Online

September 7, 2022


At least for very young children, parents are responsible for making decisions about their online presence and their privacy. Parents decide how much screen time their children have, the apps their children play with, and even whether their children have social media profiles on platforms such as TikTok or Instagram. In making these decisions, parents should, and most do, strive to act in the child’s best interests. Doing so when it comes to digital technology means thinking about the technology involved, the risks and benefits that accompany it, and, I suggest, the inherent value of privacy.

In this minefield of ever-changing technology, platforms, and information, what is best for our children? What kind of technology is actually being used? And how important is privacy, really?

Many parents will be aware of the advice provided by the eSafety Commissioner about precautions to take in posting images of children online, including avoiding visual hints as to location, clearing metadata and removing images of other children whose parents have not agreed to their inclusion in the post. However, there are other considerations to take into account. We might not think tracking is a problem, and it might seem far-fetched on commonly used and popular social media platforms or apps. But there are many ways to collect information and reuse images, and not all of them involve hidden cameras or location settings.
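The “clearing metadata” step is worth unpacking. A JPEG photo can carry EXIF segments that record GPS coordinates, timestamps and device identifiers alongside the image itself. As a rough, simplified sketch (the function name is mine, the code handles only the common case, and a mature image library is the better choice for real photos), removing those segments looks like this:

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Simplified sketch: drop EXIF (APP1) segments from a JPEG.

    EXIF segments can embed GPS coordinates and device identifiers.
    Walks the JPEG segment headers and copies everything except
    APP1 segments whose payload starts with "Exif".
    """
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG file"
    out = bytearray(b"\xff\xd8")  # keep the Start-of-Image marker
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            # Unexpected byte: copy the remainder verbatim and stop.
            out += jpeg_bytes[i:]
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:
            # Start-of-Scan: compressed image data follows; copy it all.
            out += jpeg_bytes[i:]
            break
        # Segment length is big-endian and includes its own two bytes.
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        segment = jpeg_bytes[i:i + 2 + length]
        # Skip APP1 (0xE1) segments carrying EXIF; keep everything else.
        if not (marker == 0xE1 and segment[4:8] == b"Exif"):
            out += segment
        i += 2 + length
    return bytes(out)
```

The image pixels are untouched; only the metadata segments are dropped, so the cleaned file displays exactly as before.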

Every digital interaction creates data, possibly collected by multiple companies, linked to a digital identity and analysed by algorithms to make predictions about preferences and behaviours. In profiling, moreover, demographic information that parents often choose not to disclose online about their children, such as name, birthday and address, may matter less than other factors, such as behavioural traits, interests, preferences, time spent on particular websites, locations and even biometric identifiers.

In 2021, Niels Wouters and I raised concerns about TikTok’s decision to include a right to collect biometric data in its privacy terms for US users. Biometric data consists of the unique features of faces, irises, and even gait. TikTok has since dropped those terms, but its privacy policy still allows it to collect data about users’ keystroke patterns and rhythms, and concerns have recently been raised about the cybersecurity implications of this capacity. TikTok is not alone in prompting concerns about overreaching data collection practices. For example, the CMA recently reported on the significant number of children’s apps that allow extensive amounts of data to be collected about their child users, making the apps susceptible to hacking.

We might assume children are somehow protected or insulated from the decisions of the adult world. However, the data companies collect about children does not go away or reset once they reach adulthood. Instead, data collected about children remains linked to their growing online profiles, and we do not know the consequences of this trend. As a result, today’s children and young people are the first generations to be extensively digitally tracked, monitored and profiled online for their entire lives.

What we do know is that digital profiles are increasingly being used to influence children’s consumption decisions, for example, towards unhealthy foods, gambling or alcohol. This information may also be used in the automated decision-making systems that will determine children’s future access to jobs, credit and insurance. For precisely these reasons, law reform is now underway across the world to build stronger data rights and even, in some countries, to ban profiling and automated decision-making about children.

Even seemingly innocuous images shared online are there for good. Children’s digital presence can affect their online and real-world behaviours and fuel dangerous practices or harmful trends. Moreover, firms such as Clearview AI, which sells facial recognition technology to law enforcement agencies and commercial businesses, have developed their products by scraping images of faces and other identifying information from the internet, a practice contrary to the advice of Australia’s Privacy Commissioner. These considerations raise risks of future harms that are hard to factor into today’s decisions.

But another critical consideration in making decisions about children’s online social media presence and digital interactions is the value of privacy itself. 

Privacy is a fundamental human right. For all of us, especially children, privacy allows space to experiment, create and develop our beliefs and values. 

Privacy, with the freedom it brings from an unrelenting public gaze and the scrutiny of commercial firms, is something to be nurtured.

Parents interested in their children’s well-being may wish to pay attention to the value of this right and not squander it lightly.

Jeannie Marie Paterson is a Professor of Law at the Centre for AI and Digital Ethics at the University of Melbourne.