Mr Speaker,
Before I begin, I would first like to declare that I work for a financial technology company that may fall within the definition of an online service provider as envisaged in this Bill.
The Workers’ Party understands the motivation behind this Bill, and we deeply wish to support it. But we have some reservations.
Ms He Ting Ru has addressed the first question this Bill raises: Will it adequately protect victims? She has outlined how we can strengthen that protection, particularly for vulnerable victims. I support those amendments.
I address the second question: Will this Bill be fair, accountable, and properly calibrated in its exercise of power?
This Bill grants the Commissioner significant authority—to issue directions, compel removal of content, impose obligations on service providers, and reduce engagement with material without the creator’s knowledge. These are necessary powers to address real harms. But they are also powers that should be carefully designed for their intended purpose, and appropriately constrained by institutional checks and balances.
I will speak to three sets of amendments that address our concerns about the Bill’s architecture, and to six areas requiring ministerial clarification.
Amendments
Let me begin with our amendments.
First, raising the threshold for state action.
Clause 26 sets the threshold at which the Commissioner may issue directions. The current text reads: “reason to suspect”.
We believe that threshold is too low. Too low for the powers being granted. “Reason to suspect” is a subjective test. It permits action based on intuition or preliminary information, without requiring objective evidence.
Our amendment raises this to “reasonable grounds to believe”. This is the same standard used in comparable legislation overseas, including the United Kingdom’s Online Safety Act 2023 and Canada’s proposed Bill C-63 on online harms. It requires evidence that would satisfy a reasonable person, not merely a suspicion.
Some may argue this is semantics. We do not believe so. The UK Act expressly distinguishes between “reasonable grounds to suspect” for beginning an investigation and “reasonable grounds to believe” for taking enforcement action. We believe this is the right approach. The former permits inquiry. The latter permits coercion. That distinction matters.
If we are serious about protecting victims, we must be equally serious about ensuring the Commissioner’s enforcement powers rest on evidence, not suspicion. Our amendment achieves both.
The second set of amendments I will be addressing concerns legitimate discourse.
Clauses 9, 11, and 19 define three online harms. Online harassment. Non-consensual disclosure of private information. Instigation of disproportionate harm. These definitions are necessary. But we think they are incomplete.
Let me give three scenarios:
A citizen posts fair criticism of a public official’s conduct. If a reasonable person would conclude that criticism is “abusive” or “insulting” and is likely to cause the official “distress”, under clause 9, this could be harassment.
A victim of harassment publishes text messages from their harasser online—as a call for help, as a warning to others. Under Clause 11, this could be non-consensual disclosure of private information.
A journalist publishes leaked documents exposing corruption in a government-linked entity. Under Clause 11, this too could fall within the definition of non-consensual disclosure.
Mr Speaker, I do not suggest these outcomes are intended. But the Bill as drafted seems to permit them. The definitions contain minimal carve-outs. There are no obvious exclusions for public interest, and they do not go far enough in recognising that not all disclosures of private information are harmful, and not all uncomfortable speech is harassment.
Our amendments insert these safeguards.
For Clause 9, we add: communication is not harassment if it constitutes fair comment on a matter of public interest. This is drawn from the established common law defence to the tort of defamation.
For Clause 11, we add: disclosure does not fall within the definition if the public interest in disclosure outweighs the public interest in privacy. We list seven examples, including exposing wrongdoing, informing the public on matters of significant concern, and protecting public health and safety. This amendment is modelled on a well-established balancing test in other common law jurisdictions, including the UK. As drafted, it also closely mirrors the test codified in the Australian Privacy Act.
For Clause 19, we add: communication does not constitute instigation if it relates to a matter of public interest.
These amendments do not weaken the Bill. They sharpen it. They ensure the Commissioner’s powers are used to protect victims, not to chill legitimate speech. They prevent this Bill from inadvertently silencing criticism, investigative journalism, or public-interest disclosures.
The third set of amendments establishes independent oversight.
The Bill establishes an internal appeal mechanism. Clause 60 creates an Appeal Panel whose members are appointed by the Minister. Clause 63 provides the right to appeal Commissioner decisions to an Appeal Committee drawn from this Panel. Crucially, Clause 63(5) and (6) state that “no further appeal[s]” will be permitted beyond the Appeal Committee. This makes a Ministerially-appointed committee the final arbiter.
Our amendment deletes Clause 63(5) and (6) and inserts a new Clause C establishing a right of appeal to the General Division of the High Court.
The proposed appeal is not unlimited. It is confined to three grounds: a point of law, that the harmful activity did not occur, or that compliance is not technically feasible. This is similar to the appeal mechanism in POFMA. It ensures the courts do not become a general review body for every Commissioner decision, but remain available as an independent check on questions of legality, fact, and feasibility.
Mr Speaker, this is not about distrusting the Commissioner or the Minister. This is about institutional design. When the state exercises coercive power, power that can affect livelihoods, reputations, and businesses, there must be a route of appeal to an independent court.
Some may say that our proposed amendments would introduce additional burdens on the courts. I appreciate MOS Rahayu’s and Minister Tong’s earlier clarifications as to the policy considerations at play here.
MOS Rahayu referred to the inherent right of judicial review, which remains available as there is no ouster clause in the Bill. We appreciate that judicial review is always available, but its scope is generally limited to the process by which the Commissioner made the decision, not the merits of the decision itself. The amendment we have tabled is very limited in scope. It proposes a right of appeal to the courts confined to the three grounds I have described. We understand that a balance needs to be struck, and we are striving to achieve it.
I turn briefly to Minister Tong’s suggestion that there may be a David and Goliath situation when appeals are taken to the courts. We believe the existing provisions of the Bill as drafted, notably Clause 63(4), which provides that there is no automatic stay on directions even while an appeal is proceeding, help to ameliorate that concern. There will be no continuing harm to victims while the appeal process runs its course. Furthermore, when matters do reach the courts, there are mechanisms that can address these concerns, such as private, in-camera proceedings and gag orders to protect the identities of victims.
Clarifications
Mr Speaker, beyond these three amendments, I turn now to six areas where the Bill requires clarification from the Minister.
First, the scope of doxxing.
Clause 10 defines doxxing as publishing identity information that “a reasonable person would conclude was likely to have been intended to cause harassment, alarm, distress or humiliation” to the victim.
Here is my concern: A victim identifies their hitherto anonymous harasser online to warn others in the community. Could a reasonable person conclude this was “intended to cause” the harasser distress or alarm?
The Bill does not answer this, so I ask the Minister: Will victims who expose their harassers in this manner be caught by Clause 10? If not, what safeguards exist in the Bill’s design to prevent this?
Second, standing to appeal when directions are given to platforms.
Clause 28 sets out who receives Part 5 directions. Some directions—stop communication, restraining—can be issued directly to the communicator. But others—access disabling, account restriction, engagement reduction—are issued to platforms or administrators. When a direction is not issued to the communicator directly, can they still appeal? Clause 61(1)(e) allows the recipient to appeal. But the communicator is not the recipient. Clause 61(1)(f) says they may appeal only if they fall within a description the Minister “may” prescribe under Clause 82.
I ask the Minister: Will regulations be made to ensure that communicators have standing to appeal directions that restrict their content, even if those directions were not issued directly to them? Or will this remain subject to Ministerial discretion?
Third, the meaning of “prescribed connection to Singapore”.
I understand MOS Rahayu has addressed this point. She gave the example of long-term residents in Singapore, who will fall within the ambit of this provision. We would like to seek further clarification. It is clear that citizens, PRs, and long-term residents are eligible to make a report. We would like to understand what else “prescribed connection to Singapore” could cover. Does it cover a foreign spouse on a Long-Term Visit Pass, harassed by someone in Singapore? Does it cover a former Singapore resident who has since moved overseas but is still being targeted by a Singapore-based individual?
I ask the Minister: Who is protected by this Bill? Who is excluded? What will “prescribed connection” mean when the regulations are eventually drafted?
Fourth, the exemption for public agencies.
Clause 4(2) states that public agencies cannot be given directions or orders under Part 5. Part 5 contains the Commissioner’s enforcement powers—stop communication directions, access disabling directions, restraining directions, and so on. This means that if harmful content originates from, is hosted by, or is facilitated by a public agency, the Commissioner cannot compel that agency to act.
Clause 4(3) goes further: public agencies cannot be sued under the civil proceedings provisions in Parts 10, 11, and 12.
Mr Speaker, online harm is online harm, regardless of its source. A citizen harassed through content on a government platform, or by a government account, experiences the same distress as one harassed on a private platform or by a private account.
I ask the Minister: Why are public agencies exempt from the Commissioner’s enforcement powers and from civil liability? What is the policy rationale for this asymmetry? What recourse will an individual have if they experience harassment from a rogue public employee using an official account?
Fifth, engagement reduction and class-of-material directions.
Clause 40 grants the Commissioner power to issue “Engagement Reduction Directions”. This allows the Commissioner to require a service provider to reduce the engagement of end-users with a class of material—without removing it. Clause 41(3) explicitly states it is “not necessary to give any person who may be affected by a Part 5 direction an opportunity to be heard before the direction is given.”
This means content posters are not necessarily informed. The content stays online, visible to the poster. It also remains visible to the victim. Neither will be aware that the content’s reach has been throttled. The victim will continue to see the harm, wondering if their report achieved anything at all.
Mr Speaker, this is a shadow ban.
So my question to the Minister is simple: What is the envisaged use of this particular direction? Surely, in cases of genuine, serious harm, one of the other, more forceful directions would be the better remedy for victims? And if this is meant as a “less forceful” direction, why does the Commissioner see the need to intervene at all?
Clause 40, along with Clauses 30 and 33, also provides for “class of material” directions, and these raise a related concern. They allow the Commissioner to issue stop communication and other directions against entire categories of content identified by “specific identifiers”: a username, a term, an online location. There is a high risk that this will function as a digital dragnet, capturing legitimate content alongside harmful material.
For example, a victims’ advocacy group reporting on an emerging category of online harms as a warning to the community may see its material swept up by mistake.
On class-of-material directions, I have two questions: What safeguards exist to prevent over-blocking? And what remedy exists for users whose legitimate content is caught by these directions?
Sixth, and finally, consistency in decision-making.
Parts 10 to 12 of this Bill establish statutory torts that will be adjudicated by courts. These will generate judicial precedents and published decisions. There will be transparency and consistency through the common law process.
But Part 5 directions, which may affect just as many people, are quasi-judicial in nature. There is no requirement to publish reasons or directions. There is no public record of how the Commissioner interprets and applies the definitions of harms in Part 3 and other sections.
I ask the Minister two questions:
First: Will the Commissioner be bound by prior decisions that they have made, or will each case be decided on a fresh exercise of discretion?
Secondly: Will the Commissioner be bound by clarifications given today in this House? If the Minister states today that Clause 10’s doxxing definition does not capture victims exposing their harassers, can future Commissioners be held to that clarification?
Conclusion
Mr Speaker, this Bill introduces fundamental protections for victims of online harm. We support these protections.
Our amendments ensure that the powers the Bill grants are exercised on evidence, with safeguards for legitimate speech, and with independent judicial oversight. But oversight is meaningless if those affected cannot access it.
The Bill’s ambiguities, on doxxing, on standing to appeal, on coverage, on public agency exemptions, on shadow bans, and on consistency in decision-making, are not minor. They go to the heart of who is protected, who is excluded, and whether this regime operates fairly. I hope that in the course of this debate we get clarity and comfort on these questions.
I urge the Minister to accept our amendments. If the Minister declines, I ask for clear answers to the questions I have raised. Singaporeans deserve a regime that protects victims and respects fairness in equal measure.