Child rights groups slam TikTok’s ‘design discrimination’


A study examining the default settings and terms and conditions offered to minors by social media giants TikTok, WhatsApp and Instagram in 14 different countries – including the US, Brazil, Indonesia and the UK – found that the three platforms do not offer the same level of privacy and security protections for children across all the markets in which they operate.

The level of protection minors receive from a service may depend on where in the world they live, according to the new report, titled “Global platforms, partial protections”, which found “significant” variation in the experience of children in different countries on “seemingly identical platforms”.

The research was conducted by Fairplay, a non-profit organization that advocates for an end to marketing aimed at children.

TikTok has proven particularly problematic in this regard. Alongside the release of Fairplay’s report, the company was singled out in a joint letter, signed by nearly 40 child safety and digital rights advocacy groups, calling on it to adopt a “Safety by Design” and “Children’s Rights by Design” approach globally, rather than only providing the highest standards in regions like Europe, where regulators have taken early action to protect children online.

Citing information in Fairplay’s report, the 39 child protection and digital rights organizations from 11 countries – including the UK’s 5Rights Foundation, the Tech Transparency Project, the Africa Digital Rights Hub in Ghana and the Eating Disorders Coalition for Research, Policy & Action – co-signed the letter to TikTok CEO Shou Zi Chew, urging him to address the key design discrimination highlighted by the report.

These include discrepancies in where TikTok offers minors an “age-appropriate” design experience, such as defaulting their accounts to private (as it does in the UK and some EU markets), while elsewhere it was found to default 17-year-old users to public accounts.

The report also identified many (non-European) markets where TikTok fails to provide its terms of service in young people’s native languages. It also criticizes a lack of transparency around minimum age requirements, finding that TikTok sometimes provides users with conflicting information, making it difficult for minors to know whether the service is appropriate for them.

“A lot of young TikTok users are not European; TikTok’s largest markets are in the United States, Indonesia and Brazil. All children and young people deserve an age-appropriate experience, not just those from Europe,” the report’s authors write.

Fairplay’s research methodology involved central researchers, based in London and Sydney, analyzing the platforms’ privacy policies and terms and conditions, with support from a global network of local research organizations, whose work included creating experimental accounts to explore the variations in default settings offered to 17-year-olds in different markets.

The researchers suggest their findings undermine social media giants’ claims to care about child protection, since the platforms clearly do not provide the same safety and privacy standards to minors everywhere.

Instead, social media platforms appear to be taking advantage of gaps in the global patchwork of legal protections for minors to prioritize commercial goals, like boosting engagement, over children’s safety and privacy.

Notably, children in the Global South and some other regions are exposed to more manipulative design than children in Europe, where legal frameworks have already been enacted to protect their online experience, such as the UK’s Age Appropriate Design Code (in force since September 2020) and the European Union’s General Data Protection Regulation (GDPR), which began to apply in May 2018 and requires data processors to take extra care to build in safeguards when services process minors’ information, with the risk of significant fines for non-compliance.

Asked to summarize the research findings in one line, a Fairplay spokeswoman told TechCrunch: “In terms of a one-line summary, it’s that regulation works and technology companies do not act without it.” She also suggested it is correct to conclude that a lack of regulation leaves users more vulnerable to “the vagaries of the platform’s business model.”

In the report, the authors make a direct appeal to lawmakers to put in place frameworks and policies that provide “the best protection for the well-being and privacy of young people.”

The report’s findings are likely to add to calls for lawmakers outside Europe to step up their efforts to pass legislation to protect children in the digital age, and so avoid the risk of platforms focusing their most discriminatory and predatory behaviors on minors living in markets that lack legal checks on commercial ‘datafication’ by default.

In recent months, California lawmakers have sought to adopt a British-style age-appropriate design code, while earlier this year a number of US senators proposed child online safety legislation as the issue gained more attention – although passing federal privacy legislation of any kind in the United States remains a major challenge.

In a supporting statement, Rys Farthing, report author and researcher at Fairplay, noted: “It is troubling to think that these companies are picking and choosing which young people to give the best safety and privacy protections to. It’s reasonable to expect that once a company had figured out how to make its products a little better for kids, it would roll this out universally for all young people. But once again, social media companies are failing us and continuing to engineer unnecessary risks into their platforms. Lawmakers need to step in and pass regulations that compel digital service providers to design their products in ways that are suitable for young people.”

“Many jurisdictions around the world are exploring this type of regulation,” she also noted in remarks accompanying the release of the report. “In California, the Age Appropriate Design Code, which is before the State Assembly, could ensure that some of these risks are eliminated for young people. Otherwise, you can expect social media companies to offer them second-rate privacy and security.”

When asked why Meta, which owns Instagram and WhatsApp, was not also sent a critical letter by the advocacy groups, the Fairplay spokeswoman said its researchers found TikTok to be “by far the worst performing platform”, hence the co-signatories felt “the greatest urgency” to focus their advocacy there. (The report itself also addresses issues with the two Meta-owned platforms.)

“TikTok has over a billion active users, and various global estimates suggest that between a third and a quarter are underage. The decisions your company makes about security and privacy have the capacity to affect 250 million young people around the world, and these decisions must ensure that the best interests of children and young people are achieved, and achieved equally,” the advocacy groups write in the letter.

“We urge you to adopt a Safety By Design and Children’s Rights by Design approach and immediately undertake a global risk assessment of your products to identify and address privacy and safety risks on your platform. Where a local practice or policy is found to maximize children’s safety or privacy, TikTok should adopt it globally. All of TikTok’s young users deserve the strongest protections and privacy, not just children from European jurisdictions where regulators have taken early action.”

While European lawmakers may have reason to feel a little smug in light of the relatively higher standard of protection Fairplay’s researchers found being offered to children in the region, the key word there is relative: even in Europe – a region considered the de facto global leader in data protection standards – TikTok has in recent years faced a series of complaints over child safety and privacy, including class-action lawsuits and regulatory investigations into how it handles children’s data.

Child safety criticism of TikTok in the region persists, particularly around its extensive profiling and targeting of users, and many of the aforementioned legal actions and investigations remain ongoing and unresolved, even as new concerns surface.

Just this week, for example, Italy’s data protection agency sounded the alarm over a planned change to TikTok’s privacy policy that it said does not comply with existing EU privacy laws, issuing the company a formal warning. It urged the platform not to persist with the switch, which it said could have troubling ramifications for minors on the service who may be shown inappropriate “personalized” ads.

In 2021, the Italian authority also intervened over child safety concerns it said were linked to a TikTok challenge, ordering the company to block users whose ages it could not verify. TikTok went on to delete more than half a million accounts in the country that it said it was unable to confirm were at least 13 years old.
