Kirra Pendergast

The Illusion of Privacy. Why Banning TikTok Won't Solve the Larger Problem

Updated: Apr 5, 2023


Focusing on TikTok's Chinese ownership diverts attention from a broader issue concerning social media apps and how they work. The problem is not limited to a single app; it affects every application and service that amasses user data.

This mindset is like arguing that, because some products are faulty, only local manufacturers should be allowed to sell products and foreign ones should be banned.


Prohibiting an app based on its country of origin fails to address the overarching issue concerning all apps, irrespective of location or ownership.


Apps gather extensive information about us, such as browsing history, social media activity across all of our platforms, and constantly updated location data. Ad-supported apps, which are free to use but generate revenue through advertisements, accumulate vast amounts of user data to build detailed user profiles.


In Meta's Terms of Service (Facebook, Instagram, WhatsApp), under the section "Permissions You Give to Us," it states: "When you share, post, or upload content that is covered by intellectual property rights (like photos or videos) on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content."


This section of Meta's Terms of Service means that when you share content protected by intellectual property rights, you grant Meta a sublicensable, royalty-free license to use that content in various ways, including hosting, distributing, and modifying it globally. While this allows the platform to provide a more personalised experience, it raises privacy concerns for users who may not fully understand how their content can be used and shared, or the rights they are granting to the platform. In simple terms, a "sublicensable, royalty-free license" means that when you share content on the platform, you allow Meta to use it without paying you any fees, and Meta can also grant permission to others (third parties) to use it. "Hosting, distributing, and modifying it globally" means that Meta can store, share, and make changes to your content worldwide, across its various services and products.


This data is used for different purposes, such as improving the user experience and delivering personalised advertisements. While some find this beneficial, it presents considerable privacy concerns as the online world outpaces privacy laws. Existing laws provide insufficient protection against the sale or theft of private information by any party, including governments. Consequently, consenting to terms and conditions without fully understanding them enables any entity, like app developers or government agencies, to use the data for their own purposes.


The controversy surrounding TikTok underscores this issue. There are concerns that the Chinese government might exploit the app to spy on users and gather sensitive information. Although ByteDance, the app's parent company, denies such connections, these concerns have spurred calls for banning TikTok in Australia and other countries.


However, banning TikTok is a limited solution that overlooks the broader problem applicable to all apps, regardless of ownership. TikTok is not the only app accumulating vast amounts of user data, and China is not the only country that MAY be using the data collected by apps.


The controversy surrounding TikTok highlights the broader issue of data privacy and surveillance concerns that extend to other countries, including the United States. Similar strategies are employed by various governments to access user data, with legal provisions allowing them to do so.


For example, in the United States the Foreign Intelligence Surveillance Act (FISA) allows the government to access user data for national security purposes. Under Section 702 of FISA, the US government can collect and use the communications of non-US persons outside the United States, which may incidentally include data from US citizens. Additionally, the USA PATRIOT Act expands the government's ability to conduct surveillance on US citizens and non-citizens, often without a warrant.

In the UK, the Investigatory Powers Act (IPA) 2016 is a controversial piece of legislation that governs surveillance and data collection by law enforcement agencies. The IPA gives authorities broad powers to access communications data, including internet browsing history, and to intercept communications. The act also requires telecommunications companies to store communications data for up to 12 months, which authorities can access without a warrant in certain circumstances. The legislation has been criticised for its lack of transparency, and concerns have been raised about its impact on privacy and civil liberties. However, the UK government has defended it as necessary for national security and public safety.


In the European Union, the General Data Protection Regulation (GDPR) governs data protection and privacy for individuals within the EU. However, the legislation allows for certain exceptions in the interest of public security, defence, and law enforcement. For instance, EU member states may introduce specific provisions to address national security concerns, enabling them to access user data in certain circumstances.


Australia has its own legislation, such as the Telecommunications (Interception and Access) Act 1979, which permits law enforcement agencies to intercept communications under warrant and, in many circumstances, to access telecommunications metadata without one.


Regardless of location or ownership, ad-supported apps collect and use user data in opaque and often manipulative ways.


Another thing to consider is the data collection practices of daycare centres and primary schools that use "journaling" or "compliance" apps. I recently submitted a paper on this subject, which has been accepted for presentation at the European Digital Education Network Conference in Dublin this June. My paper examines how many educational institutions now depend on apps that collect and store vast amounts of information on children, including sensitive data such as health records, daily routines, and developmental milestones. These apps provide compliance data, convenience, and enhanced communication for parents, but they also pose considerable risks to children's privacy and security. Addressing crucial questions about how this data is stored, protected, accessed, and deleted is essential to mitigating those risks.


Given the sensitive nature of the data involved, there is a pressing need for increased efforts to protect children's privacy. One crucial step is prioritising the development of robust regulations and guidelines for these apps, starting with those used in daycare centres.


The terms and conditions of many daycare and primary school apps include clauses that grant a "sublicensable, royalty-free license to use your information in various ways, such as hosting, distributing, and modifying it worldwide." Parents often agree to these terms without fully comprehending the implications, unintentionally giving away their children's data without proper knowledge of the potential consequences. This urgently needs to change. Parents must be thoroughly informed during enrolment and given the choice to consent or refuse without fear of discrimination, such as their child being excluded from photos. Schools and daycare centres should adopt alternative communication methods, such as emailing information to parents, instead of relying on apps that expose children's photos and personal data, like birthdates, in group posts. The data collection practices of these apps also need closer scrutiny.

Parents play a vital role in safeguarding their children's privacy. They should carefully review the privacy policies of these apps before agreeing to them and limit the amount of personal information they disclose.


In the larger scheme of things, it is essential to establish more robust data protection laws and regulations to guarantee privacy and data security for all apps, regardless of ownership, location, or the possibility of government access.


Implementing such measures could involve:

  • Introducing stricter data breach notification requirements.

  • Imposing limits on the sale and sharing of personal information in ad-supported apps.

  • Enforcing higher penalties for companies that fail to safeguard user data.

I currently spend part of the year in Italy, where I am writing this. Italy, for example, has taken action against ChatGPT, banning access over a suspected violation of the GDPR. Meanwhile, Australia has yet to introduce comparable measures. However, the potential ban on TikTok (whether it materialises or not) could serve as a first step towards more stringent data protection regulations in the country.

