Speech by He Ting Ru On MDDI: Indecent AI content

He Ting Ru

Delivered in Parliament on 2 March 2026

The upcoming enactment of the Online Safety (Relief and Accountability) Act (OSRA) will increase support for victims of indecent online content.

IMDA has been engaging with X over Grok’s generation of non-consensual intimate images, which were distributed en masse on the X platform. IMDA said that X has taken measures to address the issue, including stopping Grok from producing such content. Even as we ensure that the operating environment for tech platforms is not overly restrictive, could the Government explain further the outcomes of its engagement with X? Were any punitive actions taken over the matter? Notably, after the introduction of its “Spicy Mode” feature, Grok rose into the top 25 free apps on Singapore’s Apple App Store in January this year.

Secondly, in my intervention during last year’s COS debate, I cited reports of students generating deepfake nudes of their classmates and sharing them in WhatsApp groups. We must therefore tackle the real problem: the existing (and growing) demand for sexualised images, which is exacerbated by accessibility. Given that most victims are women and children, this increased accessibility puts further pressure on them. We must do much more to educate our youths on the use of AI, especially as their exposure to it grows and begins as early as Primary 4.

Given concerns about how children handle AI, how do MOE’s sexuality education approach and AI framework cover the issue explicitly? And how does MOE manage students’ emotional engagement with AI chatbots? The relationship between these images and the development of young Singaporeans is especially relevant as platforms work to become more addictive. Concerns beyond our existing legislation may also become pertinent, such as content that does not involve specific victims but nonetheless raises societal harms, for example AI-generated child pornography.

Social media and children

A child doomscrolling past bedtime is not making a choice. They are responding to a system designed to make stopping almost impossible.

The current age assurance assessment, the Online Safety (Relief and Accountability) Bill, and the Code of Practice for Online Safety represent a concerted effort to protect children online. Today, I want to ask whether this effort addresses a distinction not yet resolved: the difference between content harm and design harm.

Singapore already understands this. Regulation of the Sentosa and MBS casinos builds in deliberate friction through entry levies, exclusion orders, and visit limits. This recognises the need to interrupt behavioural design, not just to provide better information about the risks.

On social media platforms, infinite scroll, autoplay videos, and algorithmic feeds are attention-capture dark patterns designed to maximise engagement by exploiting reward-seeking and eroding self-regulation in children whose brains are still developing.

Last month, the European Commission made a preliminary finding that TikTok’s addictive design is itself a legal violation. TikTok is designated under our own Code of Practice, and the Commission found that its screentime tools and parental controls do not effectively address these risks. Singapore’s silence on the matter carries its own reputational risk.

An article in Nature Health last week argued that we must hold platforms accountable for their addictive design, which exploits children’s developing brains and erodes their capacity for self-regulation.

The question is therefore whether we should allow platforms to deploy attention-capture dark patterns against children without legal consequences.

Could the Minister therefore clarify three things?

One, does the Code of Practice require designated services to submit a design risk assessment covering recommendation systems, autoplay, and scroll architecture? And does IMDA have the power to act on those assessments independently of content classification? If so, will the Government commit to a timeline for doing so?

Two, given the European Commission’s preliminary finding of a breach over TikTok’s addictive design, has IMDA reviewed TikTok’s compliance report with this in mind?

Three, would the Ministry consider my call last week to convene a Select Committee to examine global efforts to protect children from the harms of social media, especially in light of the momentum building towards an outright ban on social media for children?

Both children and their parents deserve a framework that holds platforms accountable not just for what they show, but for how they are built. Digital environments do not shape themselves. They are designed. And design, when left unchecked, becomes policy by default.

Lessons from the Albatross Files

History is not one-dimensional. It is constantly completed and revised as more information becomes accessible. Knowledge about the past shapes how we think and act in the present in tangible ways. Declassification is critical to this process.

More credible and independently verifiable information is crucial when disinformation, misinformation, confusion, and uncertainty are rife. Transparency is not just for transparency’s sake.

Recent access to the Albatross Files underscores that Separation from Malaysia was by mutual agreement. This makes it easier to move beyond the narratives of trauma in Singapore surrounding our being “kicked out”, as we look to advance ties with our closest neighbour.

The opening of the Epstein files enabled some of the richest and most powerful people in the world to be held to account for wrongdoing, and led to the arrest of figures like Andrew Mountbatten-Windsor and Peter Mandelson. Access and accountability are especially important while the persons concerned are still alive, regardless of whether an issue is particularly heinous or more mundane.

Minister Josephine Teo stated that when deciding on access to public archives, state agencies take into consideration, and I quote, “supporting research into our collective past, while safeguarding sensitive information and complying with relevant confidentiality and other obligations.” End quote. 

To these considerations, we should add timelines, holding state agencies and political authority to public account, and avoiding confusion as well as misrepresentation. We pledge to aspire toward democracy. In a democracy, state action needs to be defensible to the public it serves and from which it receives funding.

Publicly indefensible positions and actions should not be undertaken. Decisions and behaviour must be ready to withstand public scrutiny at any time. Knowledge of this possibility encourages greater prudence and responsibility. This may make matters more challenging, but doing the right thing often is.

Singapore deserves a systematic archival declassification and cataloguing process, with clear and accessible formal procedures for requesting and reviewing holdings, including channels for appeal.
