Speech by He Ting Ru On An Artificial Intelligence (AI) Transition with No Jobless Growth Motion

He Ting Ru

Delivered in Parliament on 6 May 2026


Mr Speaker

 

Singapore’s approach to AI is often cited by international institutions and consultancies like BCG, and prominent figures such as IMF Managing Director Kristalina Georgieva. Our technological infrastructure and initiatives to upskill our workers are key parts of how we plan to confront the disruptions and opportunities presented to us by this new and rapidly developing technology.

 

However, we must also recognise and act on an additional, uncomfortable reality: Singapore is one of the most vulnerable economies to AI disruption. International estimates suggest that around 60 per cent of workers in advanced economies are in jobs that are highly exposed to AI. For Singapore, that share is significantly higher. Because we are a high‑skill, services‑oriented hub, estimates from the IMF show that approximately 77 per cent of our local workforce is highly exposed to AI disruption, and our transition is likely to be sharper and more acute than in many other economies.

 

How is AI fundamentally reconfiguring our labour market? This can be understood through three distinct shifts: 


a. First, many existing jobs will be transformed from within. AI is taking over, or has already taken over, routine, information‑processing tasks: drafting, summarising, extracting data, and standardised analysis. Managers, health professionals and legal professionals are already using AI tools to handle these types of tasks, freeing up time for judgement, complex problem‑solving and human interaction.


b. Second, some jobs will be displaced. In what economists call high‑exposure, low‑complementarity roles, AI can perform most of the core tasks on its own, and there are fewer reasons to keep humans in the loop. Clerical support workers, and many business and administration associate professionals – whose work is built around routine documentation, basic processing and standardised customer queries – face the highest risk that their roles will shrink or even disappear. Advances in agentic AI technology and models have only sharpened this impact. In the UK, some financial institutions such as investment banks have reconsidered their hiring of fresh graduates for certain roles because of AI’s automation capabilities.


c. Third, AI will also create new jobs and new demands. We are already seeing rising demand for AI engineers, data scientists and AI product specialists, but also for data‑savvy professionals across finance, healthcare, logistics and education. These new roles tend to offer higher wages – but only for workers who can supply the right mix of technical and complementary human skills.

 

Yes indeed, the AI job transformation is already here, and we are in the midst of a major disruption. Yet the impact will not be uniform across all professions, nor will it affect our society and economy evenly.

 

For now, AI disruption is strongest amongst white-collar workers, especially those in entry-level roles. Unlike previous technological disruptions, which have historically affected blue-collar jobs, AI today will most affect cognitive, white-collar roles. A call-centre agent, an admin officer, or a junior business support executive whose work day is built around standard processes, routine reports, and scripted responses is in a role where AI can perform almost all core tasks. In such high-exposure, low-complementarity white-collar roles, employers can consolidate positions, slow hiring, or redesign jobs so that fewer people are expected to do more, with AI as a convenient justification.

 

If we do not address this, the benefits of AI will end up with only a small group of workers. Research suggests that productivity and wealth gains could disproportionately accrue to those best positioned to leverage AI capabilities. One documented economic effect of high-skill job creation is increased local service demand: studies from major tech hubs, including San Francisco, indicate that each high-tech job is associated with the creation of approximately four jobs in local service sectors such as retail and food services.

 

Even if such spillover effects generate more jobs, the quality and availability of these jobs for vulnerable workers is less certain to me. In Singapore, lower-wage and routine-intensive roles are more likely to be held by vulnerable worker groups, who may also face greater displacement risk from automation. International institutions, including the IMF and World Bank, have noted that AI could exacerbate income inequality in the absence of policy intervention. The extent to which spillover effects from AI-driven growth would benefit lower-income workers remains uncertain. We need Singapore-specific research modelling these distributional impacts, and this data should be made publicly available to inform more targeted policy responses.

 

We must also remember that Singaporeans are already feeling the strain of rising property prices and higher costs for essential services. These pressures are real, and they have been building for some time, as we are a small, open economy significantly dependent on capital inflows. It is therefore a fair and pressing question to ask: would AI’s effects drive further unequal wealth accumulation, and could AI-driven economic activity add to these daily pressures?

 

Supporting Vulnerable Groups During AI Job Transformation

 

Beyond broad economic pressures, we must turn towards the human face of this transition. As jobs continue to be reshaped and workers continue to be upskilled, we cannot leave behind those who face systemic barriers as our nation progresses towards an AI-ready future. Amongst them are persons with disabilities, women, lower-income Singaporeans, and young graduates:

 

a. AI can introduce new forms of discrimination against persons with disabilities. Because AI algorithms are often trained via pattern recognition, they arrive at determinations based on common patterns within datasets. Thus, if skewed historical data is used to train AI for recruitment processes, the AI might reinforce bias against job applications from persons with disabilities, and from any other group that is historically not well represented in the space.

 

b. Female workers too face a heightened risk of marginalisation from AI. A 2024 IMF report on Singapore's labour market found that women are under-represented in AI-intensive STEM roles and among workers with AI engineering skills. Women in STEM held 29% of entry-level positions and 24.4% of managerial positions, but only 12.2% of C-suite roles. Altogether, this means they are less represented on what is regarded as the “safe” side of AI, and thus less well positioned to benefit where AI complements high-skill work. Additionally, International Labour Organisation (ILO) data released in March 2026 found that occupations dominated by women are nearly twice as likely to be exposed to GenAI as male-dominated ones, with even starker differences when looking at jobs at high automation risk. Taken together, this creates a double disadvantage: female workers are less likely to gain from AI's benefits, while remaining more vulnerable to displacement. In short, they face higher risks AND have fewer opportunities.

 

c. For our young graduates, the way AI has been reshaping jobs has introduced new uncertainty. The erosion of entry-level jobs presents a catch-22 for Gen Z: while companies are still looking to hire professionals for experienced roles, young graduates have fewer opportunities to gain such experience as entry-level work is absorbed by AI. With more than 20% of graduates unable to secure full-time permanent roles in 2025, a nearly 5% increase from 2023, and with 60% of graduates reporting that the job search has become difficult, it is only natural that young graduates have become more anxious about landing full-time employment.

 

Adding to this is recent research which has shown that simply being aware of AI’s potential to augment or threaten one’s job can increase burnout, mainly by heightening job insecurity and emotional exhaustion amongst workers. 


While AI is often associated with disruption to white-collar work, vulnerable workers and families face significant risks too. Unequal access to AI tools and training could entrench existing disadvantages. Those without the resources or home environments conducive to learning new skills may find themselves falling further behind. If left unaddressed, this risks hardening inequality across generations.

 

Measuring Impact on Singapore Beyond Economic Output

What does all this mean for Singapore? Just this past week, we have seen the launch of the “Marriage and Parenthood Reset Workgroup”. What is the effect of AI advances on our birth rate? Economic insecurity has already been cited by young Singaporeans as a reason for delaying or forgoing parenthood. But the barriers go beyond finances. Job uncertainty erodes a sense of stability and confidence in the future, the feeling that one has a firm enough footing to build a family and put down roots. If AI-driven disruption deepens this broader sense of insecurity, we can reasonably expect further downward pressure on our already tragically low Total Fertility Rate.

 

The government has to be even more targeted in ensuring that all workers, regardless of gender, age, occupation, income, or accessibility needs, are fully prepared for the disruption caused by AI, and in easing the financial pressures on vulnerable workers who are made redundant. This will minimise the uncertainty and the toll of unemployment on both workers and their families as AI displacement becomes more commonplace.

 

To ensure our policies are working, we need more public data to measure AI-driven disruption to our labour market.

 

Take, for example, how we measure the success of our AI programmes. Following up on the response to my Parliamentary Question on 24 February this year, I note that the AI Apprenticeship Programme (AIAP) is currently assessed through three primary indicators:

 

a. One, the total number of practitioners trained,

b. Two, the percentage of AIAP graduates who took up AI-engineering related roles, and

c. Three, the completion and quality of supervised projects.

 

These indicators are a good start, but they do not tell us the effects of AI programmes and disruptions on different groups in society. They measure throughput rather than equity. We need data on wage trajectories, job quality, and retention in AI roles two to three years after the programme is completed. We have to measure more.

 

First, more details of the participant profile should be made public. These can include previous occupation, income band before training, age, gender, education and disability status. This allows us to see where participants come from: high-exposure, low-complementarity roles, or already high-complementarity roles. It will also tell us whether vulnerable groups are even getting a foot in the door.

 

Next, we can more accurately measure AI disruption in the wider labour market through exposure-complementarity mapping: understanding which jobs are high-exposure and low-complementarity, and establishing a framework to track displacement, wage changes and job quality across demographic groups. Such data gives the government a clearer picture of how AI is affecting different communities, so that support can be directed where it is most needed.

 

Empowering Young Entrepreneurs 

 

Now I will turn to some thoughts on how our youth can address the challenges of AI. If AI displaces a significant share of entry-level roles, young workers may find fewer opportunities to build the foundational experience traditionally needed to progress into senior positions.

 

One of our nation’s solutions could be to better encourage and support entrepreneurship amongst our youth. This would allow them to gain valuable skills independently, rather than wait to be hired by an established firm. This approach builds on an already open door: AI has greatly reduced the barriers to starting a business, as it can be deployed to build websites, analyse data, run marketing, and even automate back-office tasks.

 

We have many schemes for startups, such as grants and bootcamps, but do these initiatives adequately provide sustained, long-term support across the full life-cycle of a burgeoning firm? Moreover, our grant architecture remains milestone-heavy and programme-bound, encouraging compliance over competitiveness. We need a culture and framework that recognises the value of a failed startup and supports founder-led networks over time. Drawing on lessons from other entrepreneurial hubs, there are two major gaps that have inhibited Singapore’s ability to establish a sustainable ecosystem conducive to entrepreneurs.

 

First, we must build sustained, informal networks that make an entrepreneurship culture self-sustaining. Our current networks are often programme-based and time-limited, skewed towards short-term coaching. Yet research shows that informal mentorships arising from mutual choice and affinity are far more effective than administrative matching. If mentorship is only linked to short-term grants, our youth may struggle to gain the trust-based guidance seen in Silicon Valley:

 

In leading entrepreneurial hubs like Silicon Valley and Shenzhen, informal founder networks have been a critical but often overlooked driver of success. They enable knowledge-sharing, supply-chain connections, and the spin-out of new ventures from anchor firms.

 

Singapore has much to learn from this. While we have anchors like Grab and BLOCK71, the Asian Development Bank has noted that our ecosystem trails others because our collaboration remains policy-driven rather than organically clustered. How can we reduce the administrative burden on founders, to ensure that they do not become overly occupied with meeting grant milestones instead of establishing the market competitiveness they need to survive AI-driven disruption? One possibility is to limit formal reporting to the end of the grant period, rather than requiring it more regularly, to strike a balance.

 

Singapore must better leverage our anchor firms. Companies like Grab, Sea, and Singtel hold deep reservoirs of technical expertise and industry networks that largely remain locked within the firm. 

 

Could we use targeted tax credits or co-investment matching for peer development programmes, to encourage anchor firms to run structured mentorship and spin-out programmes for early-stage founders? This will allow organic networks to form around existing reservoirs of excellence, rather than hope that government grant cycles will do so. The private sector must lead, and the government's role should shift from convenor and gatekeeper to catalyst. This is how we can start to grow our entrepreneurial system from within industry.

 

We must also learn to value failure. Singapore’s culture of academic emphasis and social conformity makes us afraid to fail. A 2018 PISA study by the OECD found that Singapore students expressed a greater fear of failure than their peers in any other participating country. Yet entrepreneurship means being tolerant of failure. Founders have to make decisions with incomplete information, and meaningful innovation has to be backed up by some freedom to fail. We have to treat failure as a stepping stone rather than a stigma, or we end up stifling the ecosystem we are trying to build, and leave our youth ill-equipped to flourish in an age of disruption. 

 

We can do so by beginning our own transition towards a better environment for entrepreneurship: encourage experimentation and normalise entrepreneurial failure as growth and experience. Failure should be a stepping stone, not a dead end. And we should begin this transition within schools, where we have to move away from the pursuit of “perfect scores”. Where we have entrepreneurial projects in schools that expose students to the inner workings of a startup, we should also showcase failed projects for their boldness.

 

Singapore's current bankruptcy framework can be adapted to better support entrepreneurs. Currently, founders who fail face the same restrictions as any other bankrupt: travel bans, director disqualifications, and no automatic discharge, regardless of whether their failure was the result of genuine risk-taking or of financial misconduct. Could we have a dedicated pathway for bona fide startup failure: one that allows founders to be discharged sooner, resume directorships more quickly, and have their experience recognised as something valuable rather than a liability? This is not to make failure consequence-free, but to ensure that the cost of an honest bet gone wrong does not permanently deter our most enterprising young Singaporeans from trying again.

 

Our Role in Global and Regional AI Stewardship

Finally, it is also my hope that we use our experiences navigating the AI transition to play a regional and global role as other economies attempt to navigate the disruption too. Singapore comes from a place of strength, and we have already made a deliberate decision to lead the way in setting the agenda for global AI governance.

 

Our stewardship role must extend beyond frameworks: we have to play our part in addressing the global imbalances in AI development and use that are reflected in recent data. World Bank 2025 data show that high-income countries account for 87% of notable AI models, 86% of AI start-ups, and 91% of venture capital funding – despite representing just 17% of the global population. There is justifiable concern about how vulnerable groups and the global south are woefully under-represented in the AI space. As responsible world citizens, we can do our part to address this.

 

Regionally, we have already begun developing AI tools tailored to Southeast Asian languages through Project SEA-LION, recognising that much of the developing world risks being left behind by AI systems built on Western data. We should build on this by championing equitable AI access across ASEAN, exporting our governance expertise to nations that lack the capacity to develop their own frameworks, and ensuring that the rules governing AI reflect not just the interests of the powerful, but the needs of the many.

 

This is not merely an abstract foreign policy ambition. It has direct consequences for jobs here at home. Singapore's standing in the global AI ecosystem gives us leverage to shape how AI tools are built, deployed, and adopted across the region. We should use that leverage intentionally. When our researchers develop AI systems that work across Southeast Asian languages, we create tools that can be deployed in our own service sectors, our hospitals, our schools. When our companies lead in AI adoption, we generate demand for new skills, new roles, and new industries that our workers can be trained into. We must ensure Singaporeans are in the room where these technologies are being built, not merely on the receiving end of decisions made elsewhere. Our global AI leadership is ultimately an investment in ensuring that we are.

 

This approach also has the added benefit of creating more jobs and opportunities for Singaporeans, in a true trickle-down effect.

 

Thank you.
