More than half of reports of harm not dealt with in the first instance: IMDA audit of social media firms

Source: Straits Times
Article Date: 18 Feb 2025
Author: Osmond Chia

Most social media firms took an average of five days or more to act on user reports of harmful content.

More than half of legitimate user complaints about content related to child abuse or cyber bullying, among other harms, were not addressed in the first instance by social media companies, Singapore’s media regulator has found.

Most social media companies also took an average of five days or more to act on user reports of harmful content. This is considerably longer than what social media companies claimed in their annual reports, said the Infocomm Media Development Authority (IMDA) in its inaugural Online Safety Assessment Report published on Feb 17.

The IMDA study assesses how well six major social media platforms – Facebook, Instagram, TikTok, YouTube, X (formerly Twitter) and HardwareZone – protect their users, and includes a “mystery shopper” test.

Posing as members of the public between August 2023 and July 2024, IMDA analysts flagged more than 1,000 harmful posts across the six social media platforms and observed their responses.

Instagram and X performed the poorest.

Instagram acted on only 2 per cent of the user reports raised, taking an average of seven days to respond.

Most of the harmful content was taken down only after Instagram was notified by IMDA.

X took action on just over half of the harmful content flagged, and dealt with most of the remaining posts only after IMDA notified the platform.

Also, X took an average of seven to nine days to deal with flagged sexual or self-harm material, and up to 20 days for other categories of harmful content, IMDA reported. These response times far exceeded the median of 15 hours that X itself stated for acting on user reports.

HardwareZone performed the best, responding to nearly nine in 10 reports within roughly three days.

Harmful content is defined as material that is sexual or violent, or related to self-harm, suicide, cyber bullying or vice, as well as material that can endanger public health.

IMDA’s Code of Practice for Online Safety, rolled out in July 2023, requires social media companies to minimise Singapore users’ exposure to such harm, with additional protection for children.

Overall, the six platforms were graded based on four key areas: safety measures for all users, safety measures for under-18 users, user reporting and resolution (which took into account results of the mystery shopper test), as well as accountability.

Accountability is defined as providing details on the number and types of harmful content removed, and the time taken to do so, as a result of user reports.

TikTok earned the highest overall score. It fared well in all areas except for user reporting and resolution.

Facebook, Instagram, YouTube and HardwareZone came in joint second, each with 3.5 points. Each had comprehensive safety measures for general users, but was lacking in other areas, such as measures for under-18 users or in the effectiveness of its reporting channels.

X earned the lowest score of 2.5 out of 5, owing to a lack of user protections across all the areas under the study.

Under IMDA’s Code of Practice for Online Safety, reports from users must be assessed and dealt with accordingly in a timely and diligent manner to reduce harm, such as by removing the reported content and warning, suspending or blocking the offending user.

Those who make a report should also be informed about the platform’s decision and action taken without undue delay.

Speaking on Feb 17 at the close of a closed-door discussion with online safety organisations and representatives from the social media platforms, Digital Development and Information Minister Josephine Teo said the IMDA report sets society’s expectations on behalf of users here.

“It is recognised that when it comes to responding to user reports, the response can be faster,” she said.

“For me, the most significant finding is the content that children are still inevitably exposed to – even when they are not pretending to be adults,” Mrs Teo said, adding that the industry should make it a priority to address such harmful material.

“We now have a common set of metrics,” she said. “We are encouraged that our partners in the industry have taken a constructive approach to deal with the findings.”

IMDA said in its findings that most platforms had implemented safety measures for all users, including having community guidelines, content moderation and safety resources, but needed to shore up measures to protect children from age-inappropriate and harmful content.

HardwareZone and X, in particular, performed the poorest in protecting children online, scoring 2.5 and 2 points respectively.

IMDA called out X for failing to proactively detect and remove child sexual exploitation and abuse material, saying it detected far more of such material originating from Singapore than was stated in X’s own reports.

“Children’s accounts could easily find and access explicit adult sexual content, especially hardcore pornography, with simple search terms,” said IMDA, urging X to provide an update on steps to improve safety for children.

As for HardwareZone, IMDA said users could easily slip past its age-gating system, which guards mature content with a basic prompt that simply asks users whether they are over 18.

Even on Facebook and YouTube, which each earned 4 out of 5 marks for child safety, there were instances where children’s accounts could access mature content that should have been restricted.

IMDA said platforms can decide for themselves what tools to use to protect users. It is also exploring how social media services can use age assurance technology to protect youth online, following a new policy to require designated app stores like Apple’s and Google’s to implement such measures by 2026.

The Ministry of Digital Development and Information reported in its 2024 Online Safety Poll that nearly three in four internet users surveyed had encountered harmful content online, and that only about a quarter of them reported it.

Source: The Straits Times © SPH Media Limited. Permission required for reproduction.

