Mr Speaker,
The Workers’ Party understands the motivation behind this Bill, and wishes to support it. But we have some reservations.
Ms He Ting Ru has addressed the first question this Bill raises: Will it adequately protect victims? She has outlined how we can strengthen that protection, particularly for vulnerable victims. I support those amendments.
I address the second question: Will this Bill be fair, accountable, and properly calibrated in its exercise of power?
This Bill grants the Commissioner significant authority—to issue directions, compel removal of content, impose obligations on service providers, and reduce engagement with material without the creator’s knowledge. These are necessary powers to address real harms. But they are also powers that should be carefully designed for their intended purpose, and appropriately constrained by institutional checks and balances.
I will speak to three amendments that respond to our concerns about the Bill’s architecture, and six areas requiring ministerial clarification.
Let me begin with our amendments.
First, raising the threshold for state action.
Clause 26 sets the threshold at which the Commissioner may issue directions. The current text reads: “reason to suspect”.
That threshold is too low. Too low for the powers being granted. “Reason to suspect” is subjective. It permits action based on intuition or preliminary information, without requiring objective evidence.
Our amendment raises this to “reasonable grounds to believe”—the same standard used in the United Kingdom’s Online Safety Act 2023 and proposed in Canada’s Bill C-63. This requires evidence that would satisfy a reasonable person, not merely a suspicion.
Some may argue this is semantics. It is not. In the UK Act, Section 105 distinguishes between “reasonable grounds to suspect” for beginning an investigation and “reasonable grounds to believe” for taking enforcement action. The former permits inquiry. The latter permits coercion. That distinction matters.
If we are serious about protecting victims, we must be equally serious about ensuring the Commissioner’s enforcement powers rest on evidence, not suspicion. Our amendment achieves both.
The second set of amendments concerns legitimate speech.
Clauses 9, 11, and 19 define three online harms. Online harassment. Non-consensual disclosure of private information. Instigation of disproportionate harm. These definitions are necessary. But incomplete.
Let me give three scenarios:
A citizen posts fair criticism of a public official’s conduct. If a reasonable person would conclude that the criticism is “abusive” or “insulting” and likely to cause the official “distress”, then under Clause 9, this could be harassment.
A victim of harassment publishes text messages from their harasser online—as a call for help, as a warning to others. Under Clause 11, this could be non-consensual disclosure of private information.
A journalist publishes leaked documents exposing corruption in a government-linked entity. Under Clause 11, this too could be non-consensual disclosure.
Mr Speaker, I do not suggest these outcomes are intended. But the Bill as drafted permits them. The definitions contain minimal carve-outs. No obvious exclusions for public interest. They do not go far enough in recognising that not all disclosures of private information are harmful, and not all uncomfortable speech is harassment.
Our amendments insert these safeguards.
For Clause 9, we add: communication is not harassment if it constitutes fair comment on a matter of public interest. This is drawn from the established common law defence to the tort of defamation.
For Clause 11, we add: disclosure does not fall within the definition if the public interest in disclosure outweighs the public interest in privacy. We list seven examples, including exposing wrongdoing, informing the public on matters of significant concern, and protecting public health and safety. This amendment is modelled on a well-established balancing test in other common law jurisdictions such as the UK, and the drafting is similar to the test as codified in the Australian Privacy Act.
For Clause 19, we add: communication does not constitute instigation if it relates to a matter of public interest.
These amendments do not weaken the Bill. They sharpen it. They ensure the Commissioner’s powers are used to protect victims, not to chill legitimate speech. They prevent this Bill from inadvertently silencing criticism, investigative journalism, or public-interest disclosures.
The third set of amendments establishes independent oversight.
The Bill establishes an internal appeal mechanism. Clause 60 creates an Appeal Panel whose members are appointed by the Minister. Clause 63 provides the right to appeal Commissioner decisions to an Appeal Committee drawn from this Panel. Crucially, Clause 63(5) and (6) state there is “no further appeal” beyond the Appeal Committee. This makes a Ministerially-appointed committee the final arbiter.
Our amendment deletes Clause 63(5) and (6) and inserts a new Clause C establishing a right of appeal to the General Division of the High Court.
The proposed appeal is not unlimited. It is confined to three grounds: a point of law, that the harmful activity did not occur, or that compliance is not technically possible. This ensures the courts do not become a general review body for every Commissioner decision, but remain available as an independent check on questions of legality, fact, and feasibility.
Mr Speaker, this is not about distrusting the Commissioner or the Minister. This is about institutional design. When the state exercises coercive power—power that can affect livelihoods, reputations, and businesses—there must be a route of appeal to an independent judicial body. Not an internal appeal to a Ministerially-appointed committee. But the courts, which already routinely adjudicate matters of legality and fact.
Some may say the courts will be burdened. But the courts exist precisely for this purpose. Our proposed amendment is also limited in scope. Independent oversight of executive action is not a burden on the system. It is the system. Our amendment preserves that principle.
Mr Speaker, beyond these three amendments, I turn now to six areas where the Bill requires clarification from the Minister.
First, the scope of doxxing.
Clause 10 defines doxxing as publishing identity information that “a reasonable person would conclude was likely to have been intended to cause harassment, alarm, distress or humiliation” to the victim.
Here is my concern: A victim identifies their hitherto anonymous harasser online to warn others in the community. Could a reasonable person conclude this was “intended to cause” the harasser distress or alarm? The victim’s intent may be protective, but the effect—and perhaps the foreseeable effect—is to cause the harasser alarm.
The Bill does not answer this. I ask the Minister: Will victims who expose their harassers in this manner be caught by Clause 10? If not, what safeguards exist in the Bill’s design to prevent this?
Second, standing to appeal when directions are given to platforms.
Clause 28 sets out who receives Part 5 directions. Some directions—stop communication, restraining—can be issued directly to the communicator. But others—access disabling, account restriction, engagement reduction—are issued to platforms or administrators. When a direction is not issued to the communicator directly, can they still appeal? Clause 61(1)(e) allows the recipient to appeal. But the communicator is not the recipient. Clause 61(1)(f) says they may appeal only if they fall within a description the Minister “may” prescribe under Clause 82.
I ask the Minister: Will regulations be made to ensure that communicators have standing to appeal directions that restrict their content, even when those directions are given to platforms rather than to them directly? If so, when? Or will this remain at Ministerial discretion?
Third, the meaning of “prescribed connection to Singapore”.
Clause 22 determines who is eligible to make a report. Citizens, yes. PRs, yes. But also anyone with a “prescribed connection to Singapore.” What does that mean? The Bill does not say.
Does it cover a foreign spouse on a Long-Term Visit Pass, harassed by someone in Singapore? Does it cover a former Singapore resident, now overseas, targeted by a Singapore-based individual?
I ask the Minister: Who is protected by this Bill? Who is excluded? What will “prescribed connection” mean when the regulations are eventually drafted?
Fourth, the exemption for public agencies.
Clause 4(2) states that public agencies cannot be given directions or orders under Part 5. Part 5 contains the Commissioner’s enforcement powers—stop communication directions, access disabling directions, restraining directions, and so on. This means that if harmful content originates from, is hosted by, or is facilitated by a public agency, the Commissioner cannot compel that agency to act.
Clause 4(3) goes further: public agencies cannot be sued under the civil proceedings provisions in Parts 10, 11, and 12.
Mr Speaker, online harm is online harm, regardless of its source. A citizen harassed through content on a government platform, or by a government account, experiences the same distress as one harassed on a private platform or by a private account.
I ask the Minister: Why are public agencies exempt from the Commissioner’s enforcement powers and from civil liability? What is the policy rationale for this asymmetry? And what recourse will an individual have if they experience harassment from a rogue public employee using an official account?
Fifth, engagement reduction and class-of-material directions.
Clause 40 grants the Commissioner power to issue “Engagement Reduction Directions”. This allows the Commissioner to require a service provider to reduce the engagement of end-users with a class of material—without removing it. Clause 41(3) explicitly states it is “not necessary to give any person who may be affected by a Part 5 direction an opportunity to be heard before the direction is given.”
This means users are not informed. The content stays online, visible to the creator and visible to the victim who reported it. Neither will be aware that the content’s reach has been throttled. The victim will continue to see the harm, wondering if their report achieved anything at all.
This is a shadow ban.
So my question to the Minister is simple: What is the envisaged use of this power? Surely, in cases of genuine, serious harm, one of the other, more forceful directions is the better solution for victims? And if this is meant as a “less forceful” direction, why intervene at all?
Clause 40, along with Clauses 30 and 33, also provides for “class of material” directions, and these raise a related concern. They allow the Commissioner to issue stop communication directions against entire categories of content identified by “specific identifiers”—a username, a term, an online location. This runs the risk of functioning as a digital dragnet that may capture legitimate content alongside harmful material.
For example, a victims’ advocacy group reporting on an emerging category of online harms as a warning to the community may see its material swept up by mistake.
On class-of-material directions, I have two questions: What safeguards exist to prevent over-blocking? And what remedy exists for users whose legitimate content is caught by these directions?
Sixth, and finally, consistency in decision-making.
Parts 10 to 12 of this Bill establish statutory torts that will be adjudicated by courts. These will generate judicial precedents and published decisions. There will be transparency and consistency through the common law process.
But Part 5 directions—which may affect just as many people—are quasi-judicial decisions by the Commissioner. There is no requirement for published reasons or decisions. There is no public record of how the Commissioner interprets and applies the definitions of harms in Part 3 and other sections.
I ask the Minister two questions:
First: Will the Commissioner be bound by prior decisions, or will each case be decided on fresh discretion?
Secondly: Will the Commissioner be bound by clarifications given by the Minister in this House? If the Minister states today that Clause 10’s doxxing definition does not capture victims exposing their harassers, can future Commissioners be held to that clarification?
Mr Speaker, this Bill introduces fundamental protections for victims of online harm. We support those protections.
Our amendments ensure the Bill’s powers are exercised on evidence, with safeguards for legitimate speech, and with independent judicial oversight. But oversight is meaningless if those affected cannot access it.
The Bill’s ambiguities—on doxxing, on standing to appeal, on coverage, on public agency exemptions, on shadow bans, on consistency in decision-making—are not minor. They go to the heart of who is protected, who is excluded, and whether this regime operates fairly. I hope that, in the course of this debate, we get clarity and comfort on these questions.
I urge the Minister to accept our amendments. If the Minister declines, I ask for clear answers to the questions I have raised. Singaporeans deserve a regime that protects victims and respects fairness in equal measure.

