Research examining default settings and terms & conditions offered to minors by social media giants TikTok, WhatsApp and Instagram across 14 different countries – including the US, Brazil, Indonesia and the UK – has found the three platforms do not offer the same level of privacy and safety protections for children across all the markets where they operate.
The level of protection minors receive on a service can depend upon where in the world they happen to live, according to the new report – entitled: Global Platforms, Partial Protections – which found “significant” variation in children’s experience across different countries on “seemingly identical platforms”.
The research was conducted by Fairplay, a not-for-profit which advocates for an end to marketing that targets children.
TikTok was found to be particularly problematic in this regard. And, alongside publication of Fairplay’s report, the company has been singled out in a joint letter, signed by almost 40 child safety and digital rights advocacy groups, calling on it to offer a “Safety By Design” and “Children’s Rights by Design” approach globally – rather than only providing the highest standards in regions like Europe, where regulators have taken early action to safeguard kids online.
Citing information in Fairplay’s report, the 39 child protection and digital rights advocacy organizations from 11 countries – including the UK’s 5Rights Foundation, the Tech Transparency Project, the Africa Digital Rights Hub in Ghana and the Eating Disorders Coalition for Research, Policy & Action, to name a few – have co-signed the letter to TikTok CEO, Shou Zi Chew, urging him to address key design discriminations highlighted by the report.
These include discrepancies in where TikTok offers an “age appropriate” design experience to minors, such as defaulting settings to private (as it does in the UK and certain EU markets) – whereas, elsewhere, it was found to default 17-year-old users to public accounts.
The report also identified many (non-European) markets where TikTok fails to provide its terms of service in young people’s first language. It is also critical of a lack of transparency around minimum age requirements – finding TikTok sometimes provides users with contradictory information, making it tricky for minors to know whether the service is appropriate for them to use.
“Many of TikTok’s young users are not European; TikTok’s biggest markets are in the United States, Indonesia and Brazil. All children and young people deserve an age appropriate experience, not just those from within Europe,” the report authors argue.
The methodology for Fairplay’s research involved central researchers, based in London and Sydney, analyzing platforms’ privacy policies and T&Cs, with support from a global network of local research organizations – which included the setting up of experimental accounts to explore variations in the default settings offered to 17-year-olds in different markets.
The researchers suggest their findings call into question social media giants’ claims to care about protecting children – since they are demonstrably not providing the same safety and privacy standards to minors everywhere.
Instead, social media platforms appear to be leveraging gaps in the global patchwork of legal protections for minors to prioritize commercial goals, like boosting engagement, at the expense of kids’ safety and privacy.
Notably, children in the global south and certain other regions were found to be exposed to more manipulative design than children in Europe – where legal frameworks have already been enacted to protect their online experience, such as the UK’s Age Appropriate Design Code (in force since September 2020); or the European Union’s General Data Protection Regulation (GDPR), which began applying in May 2018 – requiring data processors to take extra care to bake in protections where services are processing minors’ information, with the risk of major fines for non-compliance.
Asked to summarize the research conclusions in a line, a spokeswoman for Fairplay told TechCrunch: “In terms of a one line summary, it’s that regulation works and tech companies don’t act without it.” She also suggested it’s correct to conclude that a lack of regulation leaves users more vulnerable to “the whims of the platform’s business model”.
In the report, the authors make a direct appeal to lawmakers to implement settings and policies that provide “the most protection for young people’s wellbeing and privacy”.
The report’s findings are likely to add to calls for lawmakers outside Europe to amp up their efforts to pass legislation to protect children in the digital age – and avoid the risk of platforms concentrating their most discriminatory and predatory behaviors on minors living in markets which lack legal checks on ‘datafication’ by commercial default.
In recent months, lawmakers in California have been seeking to pass a UK-style age appropriate design code. Meanwhile, earlier this year, a number of US senators proposed a Kids Online Safety Act as the child online safety issue has garnered more attention – although passing federal-level privacy legislation of any stripe in the US continues to be a major challenge.
In a supporting statement, Rys Farthing, report author and researcher at Fairplay, noted: “It’s troubling to think that these companies are picking and choosing which young people to give the best safety and privacy protections to. It’s reasonable to expect that once a company had worked out how to make their products a little bit better for kids, they’d roll this out universally for all young people. But once again, social media companies are letting us down and continue to design unnecessary risks into their platforms. Legislators must step in and pass regulations that compel digital service providers to design their products in ways that work for young people.”
“Many jurisdictions around the world are exploring this sort of regulation,” she also pointed out in remarks to accompany the report’s publication. “In California, the Age Appropriate Design Code, which is in front of the state Assembly, could ensure some of these risks are eliminated for young people. Otherwise, you can expect social media companies to offer them second-rate privacy and safety.”
Asked why Meta, which owns Instagram and WhatsApp, isn’t also being sent a critical letter from the advocacy groups, Fairplay’s spokeswoman said its researchers found TikTok to be “by far the worst performing platform” – hence the co-signatories felt “the greatest urgency” to focus their advocacy on it. (Although the report itself discusses issues with the two Meta-owned platforms as well.)
“TikTok has over a billion active users, and various global estimates suggest that between a quarter and a third are underage. The safety and privacy decisions your company makes have the capacity to affect 250 million young people globally, and these decisions need to ensure that children and young people’s best interests are realized, and realized equally,” the advocacy groups write in the letter.
“We urge you to adopt a Safety By Design and Children’s Rights by Design approach and immediately undertake a risk assessment of your products globally to identify and remedy privacy and safety risks on your platform. Where a local practice or policy is found to maximize children’s safety or privacy, TikTok should adopt this globally. All of TikTok’s younger users deserve the strongest protections and greatest privacy, not just children from European jurisdictions where regulators have taken early action.”
While European lawmakers may have cause to feel a bit smug in light of the relatively higher standard of safeguarding Fairplay’s researchers found being offered to kids in the region, the key word there is relative: Even in Europe – a region that’s considered the de facto global leader in data protection standards – TikTok has, in recent years, faced a series of complaints over child safety and privacy, including class action-style lawsuits and regulatory investigations into how it handles children’s data.
Child safety criticisms of TikTok in the region persist – especially related to its extensive profiling and targeting of users – and many of the aforementioned legal actions and investigations remain ongoing and unresolved, even as fresh concerns are bubbling up.
Back in 2021, Italy’s data protection authority also intervened following child safety concerns it said were linked to a TikTok challenge – ordering the company to block users it could not age verify. TikTok went on to remove over half a million accounts in the country that it said it was unable to confirm were at least 13 years old.