Mr Speaker
I believe that members of this House appreciate the growing concerns relating to online harms, and the complex and evolving nature of these harms. From cyber-bullying and the rise in hate speech to AI-generated content, online exploitation, misinformation and gender-based violence, we are having to deal with a wide variety of harms, all of which carry the added threat of going viral.
Singapore has a virtually 100% internet-connectivity rate, and consequently, everyone is at risk from online harms.
Research in Singapore appears to bear out this fear. A survey released by MDDI last month found that 4 in 5 respondents have encountered harmful online content.
In May and June 2023, SG Her Empowerment’s survey on online harms found that one in two respondents aged 15 to 24 reported having been victims of online harms. The same survey found that two in five victims experienced at least one form of severe adverse impact, including suicidal ideation and physical and mental health issues.
To add to this, a World Economic Forum article published at the start of this year noted a broader trend of online platforms moving away from centralised content moderation and relying more on user contributions to address potentially misleading or harmful content. Various groups are concerned that this shift will worsen the situation for vulnerable groups and create a less safe digital environment. This is worrying because, as I mentioned during my COS MHA cut on ‘safe AI’ this year, women and children are two groups found to bear a disproportionate share of the harms associated with such online activities.
With the subject garnering greater concern in recent years, the Online Safety (Relief and Accountability) Bill is now before us. The Workers’ Party agrees that tough measures are required to tackle these harms, to allow victims to better seek redress and healing, and to ensure that platforms act responsibly.
Whatever the online harms, be they categorised by content, contact, conduct or contract, or by the nature of the harm, be it aggressive, sexual or extremist, what unifies them is that their effects on victims or witnesses are cross-cutting. All these harms threaten physical or mental health, violate privacy, or further entrench inequality and discrimination.
However, our position is that the Bill before us leaves certain areas of concern unaddressed. It is thus in the spirit of strengthening our online protection regime to better protect victims of such harms that we propose three main areas of amendment:
First, the Bill leaves out certain harms which have a severe impact on victims.
Second, we have concerns about the legal procedures enabling the Commissioner to be the final arbiter, and that certain clauses of what constitutes an online harm need further refinement.
Third, there should be reporting requirements from the Commission, which will enhance public understanding and education about the harms, our protection regime and ultimately build confidence in what we are doing to tackle it.
I will focus my speech on the first set of amendments and how we can better ensure that our tackling of online harms puts victims and vulnerable groups front and centre of our efforts. My other Workers’ Party colleagues will then cover the other sets of amendments.
Expanding the scope of online harm activities
In our amendments, we propose the statutory addition of two sets of activities in the definition of online harm activities:
First, sexual exploitation of children or vulnerable adults, otherwise known as sexual grooming, with wording proposed in a new subsection (o) to the definition of “online harmful activities” contained in Clause 3, and expanded definitions of what this comprises in the new Clause A.
Second, publication of online material encouraging or promoting suicide or self-harm, with our proposed insertion of a new subsection (p) to Clause 3’s definition of “online harmful activities”, and expansions on this in the new Clause B.
In a study published by the Institute of Policy Studies in October 2025, child sexual exploitation and the promotion of dangerous behaviours were identified as top harms, and perceived as online harms of greater severity.
Yet the Bill before us does not include the publication of online material encouraging or promoting suicide or self-harm. This is concerning, as in September 2025, it was reported that teenagers on Instagram were still able to access content relating to suicide and self-harm, and that its ‘teen accounts’ function did not appear to stop sexual content being uploaded by children. Instagram is not the only platform noted for the risks associated with the promotion of self-harm and suicide.
In Singapore, SG Her Empowerment’s survey ranked sexual harassment as the top online harm encountered by survivors and witnesses, with female youth aged 15 to 34 more concerned about sexual harms. MDDI’s survey also noted that 26% of respondents reported having encountered harmful content of a sexual nature.
Thus, with easy access to posts and content online, the high risk of sexual grooming of our minors and vulnerable adults is also of grave concern, but does not appear to be explicitly covered by the provisions of this Bill. Even though the Online Criminal Harms Act covers certain sexual offences, and various Codes of Practice have been introduced over the years, these harms do not appear to benefit from the full suite of mechanisms in this Bill, which I believe are more responsive and effective in tackling online safety issues that are highly context-dependent and time-sensitive. In particular, I refer to the Bill’s power to restrain or stop the communication of a class of material by an administrator or communicator.
Furthermore, perpetrators seeking to groom minors or vulnerable adults may share online content that encourages, promotes or provides instructions for sexual communication or sexual activity with minors or vulnerable adults. Additionally, the publication or communication of such online content – which may be directed at certain groups of persons, not just an individual – would not be covered by the Online Criminal Harms Act.
Thus, by excluding the sexual grooming of minors and vulnerable adults from this Bill, we miss out on an important protection for our children and vulnerable adults, for example the ability to make take-down orders. As the EU has pointed out, when a child is exposed to or engages with inappropriate sexualised content, they are at greater risk of related conduct – be it as targets of, or perpetrators of, the sexual grooming of minors – because such content has become normalised for them, or they have become desensitised to it. These children may also then become targets for sexual exploitation or the streaming of child sexual abuse material.
The harm posed by content that glorifies suicide or self-harm, or by the sexual grooming of minors and vulnerable adults, is therefore substantial. We hope that Singapore takes a strong stance against such online content by accepting our proposed amendments and bringing these harms within the scope of this Bill, providing more tools to tackle them.
Protection of victims
Next, victim protection and support. This is complex, as harassment, humiliation and abuse of victims come in various shapes and forms. Harassment, as we all know, is not limited to physical stalking, sending thousands of text messages or making dodgy phone calls. Our family courts, for example, are en route to recognising that it is not just physical abuse that causes real, and sometimes life-long and life-threatening, harm to victims. Increasingly, they recognise that other forms of abuse can be just as harmful, and online media can be one means through which they are propagated. We already have laws on the books regarding self-harm, the sexual grooming of minors and the protection of vulnerable adults, but they do not explicitly extend to online behaviour. Our amendments seek to harmonise these laws and extend their protections, given emergent risks in the online space.
Additionally, we have to understand that the harm experienced by victims does not end after a report or complaint is filed. It can extend beyond the sentencing of the perpetrator, long after the justice system has taken its course. If there are parallel processes, such as actions taken out under the statutory torts or Penal Code offences, the victim will often have to re-live and recount their experiences and the impact on them, and, if there is a trial, be subjected to cross-examination on the extremely traumatic event or events.
This is backed up by SHE’s survey, which found that two in five victims of online harm experienced serious emotional or mental health impact, such as depression or fear for their safety. Many withdrew from social media entirely, as documented in one of the case studies in the IPS survey. A handful of respondents even contemplated harming themselves physically or attempted suicide. Even more crucially, for each person who steps forward, how many decide against pursuing matters through the justice system because it is daunting, because they are fearful of it, or because they do not want to re-traumatise themselves by recounting and re-living what has happened to them, time and again, as they go through the system to seek redress?
Thus we also have concerns about access to justice for victims contemplating claims under the statutory tort provisions. The inclusion of these statutory torts is welcome, and empowering, but we must not forget that to file a civil claim, victims must gather evidence, bear legal costs, and re-live the harms done to them time and again. Many will be young people, students and workers who already feel powerless. For them, the idea of commencing a lawsuit is unimaginable.
So how do we tackle the concern that a fragmented system of relief may emerge: that those who are resourced can fight, while those without must simply tolerate and try to move on? Will the legal processes be simplified for victims to obtain the remedies they deserve? Directions or orders issued by the Commissioner should also be granted swiftly.
And since, under the Bill, the Commissioner is to wield quasi-judicial powers and sit in judgement on reports and complaints filed by victims or agencies, the Commission must be staffed and resourced like a quasi-judicial body, not a customer service centre. The decisions the Commissioner must make are not mechanical, run-of-the-mill decisions. They require legal, psychological, societal and cultural sensitivity. Will the new agency have specialists such as psychologists who understand trauma-informed approaches, experts in gender-based violence, and lawyers experienced not only in defamation and harassment law, but who also understand the often insidious and subtle means by which perpetrators of online harms assert power and coercive control over their targets, sometimes in the context of post-separation abuse?
This means that our processes to address online harms and provide redress and justice for victims must incorporate a victim-centric and trauma-informed approach. Staff, from frontliners receiving reports from victims to decision-makers within the agency, should be supported by professionals who understand how best to continue to support and protect victims. Taking a stand against harassment, humiliation and abuse demands much from victims. It would run contrary to the intent of this Bill if victims instead found themselves without support, or were even met with disbelief, when seeking justice.
The principle behind tackling these harms is to ensure that there is restorative justice, especially for non-criminal harms, which we as a society have decided do not warrant criminalisation.
As for perpetrators, we should also try to get to the bottom of their motivations and what causes such behaviour. Rehabilitation is thus as important as deterrence. As AWARE has recommended, could counselling orders be one of the tools made available, so that, where appropriate, perpetrators receive treatment to stop them from re-offending after their sentence has been meted out?
Education and information as inoculators against harms
In this vein, I believe that aside from providing psychological support for victims, the Commission should prioritise education as an inoculator against these harms. For OSC officers, this means ensuring evidence-based, up-to-date training for those handling reports or complaints, so that the latest victim-centric approaches are continually incorporated and victims receive the support they need. Ground-up knowledge sharing is also important, given that so much online behaviour is driven by a social media culture that is global, fast-changing and trend-driven.
We must continue and step up cross-agency and cross-sectoral education efforts for both children and adults, to increase awareness of online harms and the help available. These efforts must continue to be informed by updated research on the evolving nature of the harms and how they are propagated, and take on board the latest online trends, which often move rapidly. Apart from targeting the broad public, who may be or have been exposed to harmful online content, they should also be proactive in nature and target perpetrators or those at risk of offending. Educational efforts should also be sensitive to, and address, research findings that those exposed to harmful online content appear more likely to become offenders themselves, and be designed to reach those who may be causing both criminal and non-criminal harms.
I now turn to the other categories of amendments.
Preserving justice and transparency
While we take a strong stance against online harms, we must balance this against the risk that overreach will strip away the normalcy of our online usage. As such, we have tabled amendments to clauses 9, 11, 19 and 26, to ensure that individuals can communicate online material that constitutes fair comment on matters of public interest. The Commissioner can still issue directions or orders where there are reasonable grounds to believe that an online harmful activity has been conducted.
To further strengthen our understanding of, and faith in, the regime, we have also included a new clause C, allowing appeals to the judicial system. This would provide transparency to both online entities and individual users.
My colleague NCMP Andre Low will elaborate more on these points.
Assessing our protection regime
The final group of amendments that we have tabled call for the Commissioner to prepare and submit an annual report to Parliament, and specify the areas which should be covered by the report. These include the number of reports received by the Commissioner, the categories of these reports, the number of directions and orders issued by the Commission, the categories of persons or entities who have been issued with a direction or order, and findings by the Commissioner on the risk assessments and trends of online harm.
The proposed new clause E also gives the Commissioner explicit powers to require online service providers to disclose information about their own measures to tackle online harms. Such information is particularly necessary given the IPS finding that 75% of respondents think that, apart from government and users, tech companies must also do more to tackle online harms, and the finding of a 2024 MDDI survey that 80% of respondents who reported online harms experienced issues with platforms’ reporting processes.
These proposed amendments are driven by the need to ensure that our laws are better understood, and remain ahead of the rapidly evolving landscape of online harms. They also ensure that groups disproportionately affected by online harms – women, children and vulnerable adults – have their interests represented.
NCMP Eileen Chong will speak more on these amendments.
Mr Speaker, in conclusion, I would like to emphasise that these proposed amendments aim to strengthen and develop this Bill in line with the government’s stated intentions. With this Bill, we believe there is an opportunity to act more effectively against material on suicide, self-injury, and sexual grooming of minors and vulnerable adults. We have the chance to legislate more clearly to ensure that victims can efficiently, effectively, and safely get recourse for online harms. We should also work to make online harms less likely to happen in the first place, through education, research, and by ensuring platforms are aware of their role in facing up to the challenge, all while ensuring the justice system works effectively and as intended.
I believe that all of us in this House, as Parliamentarians, as Singaporeans, and as human beings, wish online harms were less common and less hurtful than they are. That said, we cannot un-invent the internet. And we are at a point in time where the digital world has made it so easy for people to harm each other without having to directly experience the hurt they have caused others. Our duty to remedy harms is highly complicated, and the balance that we seek in legislation and practice will be tested by edge cases, unforeseen circumstances, and changes in trends and technology. But today, to the best of my knowledge and from speaking to experts, NGOs, and everyday people who have to deal with these harms, we can legislate better with the amendments I have proposed, and I hope that this House will accept them.
Thank you.