
Online harms: Can app stores be effective gatekeepers?: Opinion

Source: Straits Times
Article Date: 03 Feb 2025

Kids have had easy access to harmful content. Singapore's new code hopes to change that.

It was news that gave parents struggling to keep their children safe from the onslaught of harmful digital content reason to cheer. Singapore’s media regulator announced that it is rolling out a new code requiring app stores to screen and prevent users aged below 18 from downloading apps meant for adults, such as dating apps or those with sexual content.

This age screening is a key aspect of the Code of Practice for Online Safety for App Distribution Services being introduced by the Infocomm Media Development Authority (IMDA), which takes effect from March 31.

The code requires five designated app distribution services – Apple App Store, Google Play Store, Huawei App Gallery, Microsoft Store and Samsung Galaxy Store – to implement measures to set community standards, minimise risk of exposure to age-inappropriate harmful content and respond promptly to users’ reports of violations.

As parents well know, current safeguards that app developers use, such as age ratings, are inadequate.  

Problems include the fact that they can be easy for children to bypass; the content can be unsuitable for the listed age rating; and apps are sometimes updated with new content without the age rating being adjusted.

The new code reinforces protection at the system level by requiring app stores to implement accurate age ratings for apps.

In Singapore, the new code for app stores follows measures including the Code of Practice for Online Safety and the Online Criminal Harms Act introduced in 2023. Just recently, the Government announced that it would set up a new agency to provide support for victims of online harm.

Some might wonder: given these existing regulations, which cover both preventive and remedial interventions on the part of online communication services – which include app distributors – is another code required?

The reality, however, is that stark gaps remain despite these measures. And while the new code is a timely move to help mend these cracks, it is far from bullet-proof.

Global crackdown on harmful content

Boosted by the US Surgeon General’s calling out of social media platforms’ impact on adolescents’ mental health, governments around the world are increasing their scrutiny of tech companies and introducing measures to protect young users.

They include the Australian Parliament’s approval of a law that will ban children younger than 16 years old from using social media and Indonesia’s consideration of a similar move. 

Social media services have responded by putting in place features that attempt to reduce users’ exposure to harmful content. For example, as of Jan 21, Instagram users younger than 18 years old in Singapore are being moved into Teen Accounts, where they will be shielded from sensitive and violent content, and from strangers.

Instagram is also trialling more accurate “age assurance” measures – methods to determine a user’s age – including real-time facial scanning technology and checking of other social media accounts linked to the user’s e-mail address or phone number. It is also looking into proactively identifying users who have not declared their real age by analysing their profile picture and their followers’ demographic profile.

But measures have to be tougher

That said, these measures taken at the platform level are insufficient when compared with the requirements imposed on retailers of other age-restricted products.

Just as shoppers of all ages at a supermarket come into contact with a wide range of products, users are exposed to many online products and services when they “enter” an app store.

When shoppers in Singapore want to purchase “age-rated” alcohol and tobacco products, they must be 18 and 21 respectively. Retail staff have the right to verify shoppers’ ages against their identification documents. Retailers who fail to prevent such products from falling into the hands of underage shoppers are liable to fines and suspension of licences. 

Yet, children and young teens do not have to verify their age when they download apps from a store, beyond simply declaring what their age is.

The scale of this problem – and the reason why age screening at the point of registration or purchase matters – emerges when we consider that the potential for online harm to children goes beyond the usual suspects. Parents will be aware of the potential harm in games and chat services such as Grand Theft Auto, Fortnite, Roblox, Snapchat and Holla. Some of these are popular games that contain violent content or mature advertisements unsuitable for children. Others are chat services that, in the name of helping children make friends, expose them to potentially dangerous interactions with strangers.

But there are also apps that seem benign, such as those for weight loss. While the content might not be harmful to adults, in the hands of children and teenagers who are image-conscious and susceptible to peer pressure, such apps can encourage body-shaming and extreme dieting behaviour.

Other apps that seem harmless at first sight include educational apps that exert negative influences on their target audience. Researchers from the University of Michigan Medical School found that apps catering to young children contained adverts that emotionally manipulated them into making in-game purchases and shamed them when they did not. 

What more needs to be done? 

Once the new code comes into effect, app stores have a year to put in place tools to screen users’ estimated age to ensure that children and teenagers cannot access age-inappropriate apps. However, such stores deal with millions of apps, many of which roll out frequent content updates that might render their initial ratings invalid. 

So the code’s requirement for app stores to respond to user reports of harmful content in a timely manner will play an important part in helping them close the gaps. 

But closing the gaps requires more than this. Making the requirement truly effective depends on whether young users, parents, caregivers and others who play an integral part in young people’s lives take action when they come across harmful and inappropriate content.

To empower young users and help their caregivers step up when they come across harmful and inappropriate content, there must be accessible and easy-to-use reporting channels and information on how to make those reports. App stores must ensure that app providers offering features that allow user-generated content put in place an in-app channel for users to report directly.

So, the roll-out of reporting tools, when ready, must be accompanied by education to grow public awareness and encourage action. The upside is that Singapore is already adopting a holistic approach to improving children’s and young people’s technology use. For example, the Ministry of Health, Ministry of Education and Ministry of Social and Family Development recently launched updated guidelines on screen use as part of a new national health strategy to nudge children and adolescents towards healthy lifestyles.

Another factor to consider is how businesses can further protect children at the device level. For example, the Chinese government has proposed that phone makers develop a “minors” mode for devices, which parents can use to set usage limits and filter content. Such built-in features are similar to how some newer phone models counter fraud and scams by prohibiting the sideloading of apps – the practice of installing mobile apps from outside the official app stores.

Businesses can also play a part at the point of purchase. Most mobile devices come with tools such as Google’s Family Link and Apple’s Screen Time, which provide content filtering and parental controls such as the ability to turn off in-app purchasing and app store access. Retailers can advise parents and caregivers on how best to use such settings.

Solving a problem like online harm requires intervention at different levels. The new code increases protection for young users, but helping them navigate the online space safely while reaping the benefits of technology requires the commitment of all parties. 

Dr Carol Soon is associate professor in media and digital policy at the department of communications and new media at NUS. She is also vice-chair of the Media Literacy Council.

Source: The Straits Times © SPH Media Limited. Permission required for reproduction.

