Singapore reins in app stores to protect the young
Source: Straits Times
Article Date: 27 Jan 2025
Author: Osmond Chia & Sarah Koh
Just as age verification is mandatory for pub entry, app stores will soon need to check the age of young users before allowing them to download apps for grown-ups.
Come March 31, Singapore’s media regulator will roll out a new code requiring, among many things, that app stores screen and prevent users aged below 18 from downloading apps meant for adults, such as dating apps or those with sexual content. App stores have one year until March 2026 to roll out the measures.
All other obligations spelt out by the Infocomm Media Development Authority (IMDA) in the Code of Practice for Online Safety for App Distribution Services will kick in on March 31, 2025.
For one thing, app stores must have community standards in place and enforce them among app developers, and also respond promptly to reports of violations flagged by their users.
The new code aims to set guard rails at the gateway to apps, which have come under fire for exposing children to all sorts of harmful content, including sexual and violent material and content linked to self-harm or cyber bullying.
The requirements of the code will apply to Apple, Google, Huawei, Samsung and Microsoft as they operate stores or online portals for downloading applications.
With the responsibility pinned on app stores to manage app developers, the authorities have a single point of contact to crack down on problematic apps, including those containing terrorism-related or child abuse material to which children could be exposed.
The new measures are an attempt to rein in app stores in a similar way to how social media platforms are required to provide restricted account settings and tools for parents to manage their children’s safety under Singapore’s Code of Practice for Online Safety, which took effect in 2023.
Rule flouters risk being fined up to $1 million or blocked under the Broadcasting Act, which was amended in 2023 to rein in social media platforms and app stores.
Pros of the new code
A key feature of the upcoming Code of Practice for Online Safety for App Distribution Services is age screening.
The logic is simple: If young users are not allowed to download apps not meant for them, the harm stops at the gate.
Institute of Policy Studies (IPS) adjunct senior research fellow Chew Han Ei said that app stores act as “gatekeepers”.
“By targeting the app stores, the code tackles a critical access point and introduces clearer responsibilities for regulating content,” he said.
The success of the law hinges on the effectiveness of age verification methods.
App stores have until March 2026 to roll out the tools to screen the approximate age of users to prevent children from installing dating apps like Tinder, which is rated for ages 18 and above, and those under 12 from downloading TikTok or Instagram.
Today, few app stores enforce age-appropriate downloads. Many simply ask users to state a birth date before downloading a mature-rated app.
Lax age verification has been a headache for parents, said Madam Salizawati Abdul Aziz, 41, who has caught her own children trying to fake their age to download apps for grown-ups.
“I realised how inappropriate some of the apps can be so I started to control their access with family mode settings,” said Madam Saliza, a teacher with six children.
“By having this code,” she added, “app stores can help to restrict and block young users from downloading inappropriate apps, and that would prevent unnecessary exposure to such content in the first place.”
The law also broadens the authorities’ powers to crack down on harmful user-generated content within apps, such as inappropriate conversations with strangers on in-game chat functions or content posted on social media platforms.
The authorities can now knock on the doors of app stores, which are required to take action against app developers who fail to address users’ complaints about harmful content, or implement measures to spot and address harms. These actions include a ban or removal of the app from the store.
User-generated content has long been a pain point for parents who fret about what appears on their children’s feed.
Popular games like Minecraft and Roblox that are available on app stores have become hunting grounds for predators and bad company, who can target children through in-game chats.
Notably, a radicalised 16-year-old in Singapore had joined several Roblox servers, with maps depicting conflict zones occupied by real-world extremist groups.
The code of practice for app stores will work in tandem with 2023’s Code of Practice for Online Safety, which requires social media platforms to address harmful content affecting children.
Similarly, app developers are required to monitor their platforms for online harms, provide tools for parents to manage their child’s safety, and respond promptly to user reports of harmful content.
Associate Professor Carol Soon of NUS’ Department of Communications and New Media said that by imposing similar duties on app stores, the authorities will add an extra layer of protection to shield children from harmful content on top of existing regulations.
Social media services were targeted under the Code of Practice for Online Safety, whereas the code for app stores targets all sorts of apps, she added.
Prof Soon, who is also the vice-chairwoman of the Media Literacy Council, said: “This new code enables another layer of mitigation at the system level as it targets app stores that act as users’ entry points to a wide range of services and products, many of which are not covered by existing regulations.”
Hurdles
One of the biggest hurdles to reining in harms online is how to tell if a user is a child.
IMDA has suggested two ways: the use of government-sanctioned identification or credit cards, or technology such as artificial intelligence (AI) and facial screening.
The jury is still out on how these will be implemented. Samsung and Google, for instance, said they are considering their options and are in discussions with the authorities, which have given app stores until March 31, 2026, to roll out the measures.
Experts said government ID ranks among the most precise ways to check a person’s age but that it can be seen as heavy-handed as it would require significant changes and raise privacy concerns.
Tech adviser Josh Lee from legal firm Rajah and Tann said IDs are typically used for high-risk services.
App stores and the authorities will need to address privacy concerns, such as fears that the data collected could be used for other purposes, said Mr Lee, who is also a committee chairman at non-profit organisation Cyber Youth Singapore.
It remains to be seen how far Singapore can influence app stores, because businesses generally prefer consistency in their global operations, as variations in their apps could rack up costs, said Mr Lee.
Mr Glenn Gore, chief executive of identity solutions provider Affinidi, proposed that app stores and the authorities could utilise zero-knowledge proofs – a digital verification process to confirm that users are who they claim to be without revealing sensitive details – tapping national databases like Singpass.
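To illustrate the idea behind such a scheme, here is a minimal, hypothetical Python sketch. It uses a simple signed attestation from an identity provider rather than a true zero-knowledge proof, and all names, keys and values are illustrative assumptions, not any actual Singpass or app store API: the point is only that the app store receives and verifies an “over 18” claim without ever seeing the birth date itself.

```python
# Hypothetical sketch: an identity provider attests "this user is over 18"
# without disclosing the birth date. This is a simplified signed-attestation
# illustration, not a real zero-knowledge proof; real deployments would use
# public-key credentials or ZKP protocols. All names and keys are made up.
import hmac, hashlib, json
from datetime import date

PROVIDER_KEY = b"demo-secret"  # stands in for the identity provider's signing key

def issue_age_attestation(birth_date: date, threshold: int = 18) -> dict:
    """Identity provider checks the birth date privately and signs only the claim."""
    today = date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    claim = {"over_threshold": age >= threshold, "threshold": threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}  # note: no birth date included

def app_store_accepts(attestation: dict) -> bool:
    """App store verifies the signature and the claim, never seeing the birth date."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, attestation["signature"])
            and attestation["claim"]["over_threshold"])

print(app_store_accepts(issue_age_attestation(date(2010, 5, 1))))  # False: under 18
print(app_store_accepts(issue_age_attestation(date(1990, 5, 1))))  # True
```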
The alternative to ID is facial screening technology, which typically uses the smartphone’s front-facing camera and some form of AI to gauge the user’s age.
Experts said age-screening technology is prone to huge margins of error.
A 2024 study by the US National Institute of Standards and Technology on age estimation and verification methods found that the technology’s accuracy varied. Factors such as camera quality or the user’s skin condition, colour and facial structure can all affect the accuracy of readings, it found.
Associate Professor Terence Sim of NUS’ School of Computing said that age screening technology may reliably estimate broad age groups, such as people in their 20s or 30s, or children under 13.
But teenagers undergo significant physical changes, making precise age estimation much more challenging, he added.
“This is not a tech problem, it’s just biology. Some people just look younger or older than they are,” said Prof Sim.
Mr Campbell Cowie, head of policy at biometric solutions firm iProov, said platforms should use a mix of age screening technology and identification.
“Age estimation solutions are okay if it’s voluntary. It would just be a best effort,” said Mr Cowie. “But if we want something accurate to use as part of legislation, age verification is the only accurate option.”
He added that the authorities will need to spell out standards for reasonably assessing users’ ages to avoid inconsistencies between app stores, which could put users at risk of harmful content.
Even with facial scanning, there are ways a determined child can evade age checks.
Rajah and Tann’s Mr Lee said children who have access to their parents’ device could access such content since their device would likely be configured to an adult profile.
Children could get the assistance of an adult to pass the age checks and download an app on their own device, he said, adding that adults also play a key part in the issue.
Prof Sim said platforms need to go one step further and require checks in real time, instead of accepting a one-time approval at the point of registration.
For this to work, app stores should seek biometric authentication – a guarantee of explicit approval – each time a user wants to install a mature-rated app. The user’s account should also be linked to digital identification, so that the app store can be certain of the downloader’s age, Prof Sim added.
“You shouldn’t rely on passwords too. They are simply not secure enough as a child might already know the parent’s password, which might have been given to them during the phone’s set-up,” he said.
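A rough sketch of the flow Prof Sim describes might look like the hypothetical Python below. The data model and function names are assumptions made for illustration: each request to install a mature-rated app triggers a fresh biometric prompt and an age check against a linked digital ID, rather than trusting a one-time sign-up approval or a shared password.

```python
# Hypothetical per-install check: a real-time biometric prompt plus an age
# lookup against a linked digital ID, triggered on every mature-rated install.
# Names and the data model are illustrative assumptions, not any store's API.
from dataclasses import dataclass

@dataclass
class Account:
    verified_age: int          # age confirmed via a linked digital ID
    biometrics_enrolled: bool  # device has biometric authentication set up

def biometric_check_passes(account: Account) -> bool:
    # Placeholder for a real-time fingerprint/face prompt on the device.
    return account.biometrics_enrolled

def may_install(account: Account, app_min_age: int) -> bool:
    """Allow installation only if the verified age meets the app's rating
    and the account holder passes a fresh biometric prompt for this request."""
    if account.verified_age < app_min_age:
        return False                          # rating check against the linked ID
    return biometric_check_passes(account)    # explicit approval, per install

parent = Account(verified_age=41, biometrics_enrolled=True)
child = Account(verified_age=12, biometrics_enrolled=True)
print(may_install(parent, 18))  # True
print(may_install(child, 18))   # False: blocked regardless of any shared password
```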
Experts and parents highlighted inconsistencies in how content is rated. For instance, some apps include films or user-generated content potentially unsuitable for children, even though the app is rated suitable for children on the app store.
These include streaming services like Netflix, which, despite carrying a 12+ age rating on app stores, hosts shows intended for mature audiences (18+), typically protected with a code to restrict access.
NUS’ Prof Soon said the mismatch boils down to age ratings being submitted by the app developers themselves in most cases. But enforcing accuracy could be tough for app stores, given the sheer number of apps available, she added.
App developers can also make content updates that render the initial age rating invalid, which makes user reporting channels to flag inappropriate content crucial if developers are to have any hope of keeping up.
Room for more
As the five major app stores run by the likes of Apple and Google figure out suitable tools to spot children, one of the biggest gaps is the omission of PC gaming marketplace Steam.
With millions of users globally, Steam is the PC gamer’s gateway to games with explicit content, such as Grand Theft Auto, as well as titles like Roblox that have come under scrutiny over user-generated content and forum interactions.
Highly sexual titles, often labelled NSFW (not safe for work), are a mainstay on Steam’s menus too – with some even free to play.
Yet, the platform has been criticised for its lax age verification, which simply asks users to declare a birth date before downloading a mature-rated game.
Prof Soon said there is a possibility that other platforms will fall under the code, which targets five dominant services in the market for the time being while the regulators figure out the most suitable age assurance methods.
The role of parents, too, cannot be overstated in the quest to keep children away from online harms.
Experts said there is little stopping a parent from approving the installation of adult games, while others might not be aware of how children might try to game the system.
NUS’ Prof Sim suggested the use of notifications to parents as a way to deter children from installing applications for adults.
App stores should alert parents through e-mail or a prompt each time their child requests to install a mature-rated app, the same way Instagram alerts the account holder when there is a log-in on a new device, Prof Sim added.
This is feasible, since most young people buy devices with the help of their parents, who can in turn be required to set up the device beside their child and implement notifications and other guard rails, he said.
“It will make children think twice,” he said. “Parents also have an opportunity to talk to their children about the content they consume.”
Prof Sim added: “If your child had a drinking problem, you’d want to talk to them. It works the same online.”
‘She was trying to hide it’: Online harms lead to some parents wanting social media ban
When asset manager Look Ru Shin noticed her daughter, then aged 10, glued to her phone, something felt off.
“She was trying to hide it,” Madam Look, 46, recalled.
A closer look revealed that her daughter, Claire, had received inappropriate messages from a stranger on Minecraft. “I immediately deleted her account and the app,” she said.
The incident heightened Madam Look’s concerns about harms online that she had seen, such as toxic trends, warped beauty standards and sexualised content, and their detrimental impact on children.
Many parents stand with her in wishing to see more handholding when it comes to online usage for children, but are torn on what the most effective measures should be.
Regulators globally are at a crossroads with online platforms and harmful content that plagues social media, games and other apps.
Many nations, including Britain, European countries, Indonesia and Singapore, are mulling over age restrictions – with some even considering outright bans on social media for teenagers.
Australia has led the charge, passing a ban that will kick in later in 2025 to outlaw social media for teenagers under 16.
Singapore’s Minister of State for Digital Development and Information Rahayu Mahzam said in Parliament in January that Singapore shares the same objectives as Australia in legislating age limits for social media access to protect young users and is in talks with the country over Australia’s approach.
The Republic’s approach – for now – targets app stores and app creators, who are required to follow standards under codes of practice to address harms on their platforms or face penalties like a fine of up to $1 million or be blocked here.
App stores have until March 2026 to set up technology that can check users’ age – a key measure in keeping children away from mature apps.
As Singapore evaluates the merits of a social media ban, the views among parents interviewed by The Straits Times are split over whether this will be effective. Some are in favour of a sweeping ban while others want to see brakes applied to online usage but wish to ease their children into the online world.
Ms Tammie Wong, 46, supports the ban, which she likened to having a law against underage drinking to protect teenagers from its harmful effects.
She said: “The advent of social media was so quick that its effects on our youth are unprecedented and uncharted. Many effects are surfacing only now.”
Ms Wong added: “Without a concerted and clear signal, the extent to which parents can control kids’ usage is limited, especially when they reach teenhood.”
Dr Natalie Games, a clinical psychologist at Alliance Counselling, said that a ban works like scaffolding for parents, as it would usher in more protective measures around social media and support them in their struggle to manage their children’s use of social media.
More than half of her clients, especially teenagers, talk about social media in the context of struggles.
But a ban is just one step as digital literacy is still crucial to helping teenagers navigate the digital world, she said.
Online content has been blamed for encouraging harmful behaviour and mental health issues among young people.
Many are still reeling from the tragedy of British teenager Molly Russell, who took her own life in 2017 after being exposed to thousands of Instagram and Pinterest posts about depression, self-harm and suicide in the months leading up to her death.
Yet, young people today are still frequently exposed to suicide and self-harm content online. TikTok has been criticised for circulating viral challenges encouraging dangerous activities, such as ingesting chemicals or dangerously holding one’s breath.
Another parent, Madam Salizawati Abdul Aziz, 41, said she feels a sense of unease as her six children, aged between three and 17, are increasingly exposed to online services.
“Most alarming for me was that it is so easy to access videos with sexual content, and ones that feature women in a sexualised manner, like in skimpy clothes and doing things to entice viewers,” added Madam Saliza, a teacher.
“I was shocked that I could see these as I casually swiped through the Facebook shorts or IG stories. It got me thinking then how do I filter and prevent my young children from viewing such things,” she said, welcoming the stricter expectations on online services.
She noticed differences between those who had online devices while growing up and those who did not.
The older children who grew up largely without social media till their late teens are typically more sociable and focused, said Madam Saliza.
Her younger children who have been exposed to more screen time at a young age are less focused and tend to get upset when they are told to put down their device.
“I would prefer them to immerse themselves in books and building Lego, like what the previous generations did.”
Madam Saliza added: “In such a tech-savvy generation, we cannot stop children from being exposed and using social media and apps as it’s already part of our daily lives. But we need to be aware of what our children are viewing and accessing.”
For her 15-year-old daughter Syaliz, hours feel like seconds when she is using TikTok.
Syaliz said: “I just keep scrolling, and don’t realise that I’ve gone four hours on it.”
The Secondary 4 student said she encounters content related to mental health and body image on her Instagram feed on a daily basis, which led to her becoming more self-conscious about her own body.
“I’ll scroll and see many fit and healthy-looking influencers post about how people call them fat. I started to think, ‘if people consider them fat, then what am I?’”
Mr Bradley Joe, a father of three, said that while tech platforms must do better, a ban would be rushed as it would take away lessons that can be taught to children about social interaction.
The 42-year-old, who works in tech sales, said: “From the ages of 10 to 16, these are formative years. It’s also a time when many of them build deeper relationships, learn about social interaction, and when acceptance from others is important.”
Experts said that unlike a ban, the Code of Practice for Online Safety for App Distribution Services offers more targeted controls for children.
Dr Chew Han Ei, an adjunct senior research fellow at the Institute of Policy Studies, said the code focuses on app stores as gatekeepers of the digital ecosystem.
“It is practical and less rigid than a blanket ban,” said Dr Chew. “By placing responsibility on app stores to manage access and content, the code offers a targeted solution that can be adapted for different age groups.”
Like many parents, Madam Look walks a fine line between giving Claire, now 12, freedom to manage her own screen time and shielding her from potential harms.
Claire began using Instagram and YouTube when she was 11, limited to 30 minutes per day, said Madam Look, who configured settings to lock her daughter’s phone in sleep mode at night.
Madam Look added: “I set limits because at that young age, she was not able to exercise self-control. The content can be quite addictive.”
If a ban were to be imposed, it is easy to imagine children, who are already exposed to mobile phones, turning to unknown sources to socialise online, creating an even larger headache for parents, she said.
“I think that the blanket ban in Australia is a very blunt tool,” Madam Look said, adding that teenagers should slowly learn to manage their use of social media, rather than to have it totally removed.
“I have seen my daughter learn useful skills from social media, and found healthy inspiration to be creative in schoolwork and her personal hobbies,” she said. “We would lose all of that.”
Source: The Straits Times © SPH Media Limited. Permission required for reproduction.