The artificial intelligence (AI) boom is often framed as a race for compute power, talent and investment. But beneath the surface, a different constraint is emerging: one that is far less visible and harder to scale. Energy.
The rapid expansion of AI infrastructure globally is beginning to test the limits of power grids, water systems and public tolerance.
ASEAN Vice President and General Manager at Hitachi Vantara.
Data centers, once treated as neutral enablers of the digital economy, are now at the center of a growing tension between technological ambition and physical reality. Southeast Asia is not yet at the center of that tension. But it is moving quickly in that direction.
Across the region, governments are positioning themselves as the next hubs for AI infrastructure. Investments are pouring into data centers, semiconductor ecosystems and AI-enabled industrial zones.
Global cloud providers are expanding aggressively, drawn by policy support, improving connectivity and proximity to fast-growing markets. For now, the momentum looks like a success story.
But the question worth asking is not how fast Southeast Asia can build. It is how long that pace can be sustained.
The invisible cost of AI
AI’s energy demands are often discussed in abstract terms: efficiency gains, optimization, performance improvements. What is less often acknowledged is how quickly those demands scale in practice.
Training large models requires enormous bursts of compute. Running them at scale requires constant, sustained power. Modern AI data centers operate at densities far beyond traditional facilities, with cooling systems alone consuming a significant share of total energy use.
The result is that AI is no longer just a software or infrastructure story. It is an energy story.
In some markets, that reality is already becoming difficult to ignore. Rapid data center expansion has contributed to grid congestion, rising electricity costs and growing scrutiny over water consumption.
In a few cases, it has also triggered local resistance not because communities oppose technology, but because they are being asked to absorb its physical footprint. These are not distant risks. They are early signals.
Southeast Asia’s moment — and its dilemma
Southeast Asia is entering this phase with a different set of constraints. Many countries in the region are already balancing rapid urbanization, industrial growth and rising energy demand. IT infrastructure planning is often stretched across competing priorities.
In that context, the arrival of AI-scale workloads is not an incremental change. It adds new pressure to systems that are already under strain.
At the same time, Southeast Asian countries are racing to attract hyperscale investment, offering incentives and accelerating approvals. What has emerged is something close to an AI infrastructure gold rush, one in which speed is increasingly seen as an advantage.
But speed on its own does not resolve constraints. It tends to expose them. There is already a divergence in how countries are approaching this.
In Singapore, policymakers have taken a more deliberate path. A pause on new data center developments several years ago was not a retreat, but a recalibration.
Growth has since resumed, but with tighter controls — prioritizing efficiency, low-carbon operations and closer alignment between digital infrastructure and energy planning.
Singapore remains limited by land, energy imports and grid capacity. But it has reframed the challenge: expansion is possible, but only within clearly defined boundaries.
Elsewhere, the picture is less settled. In Malaysia, investment momentum has accelerated rapidly, particularly in Johor and Cyberjaya. The country is emerging as a regional infrastructure hub, supported by connectivity advantages and strong policy support.
Yet alongside that growth, concerns around electricity tariffs, water usage and long-term grid capacity are becoming harder to ignore.
Neither approach is inherently right or wrong. But they point to the same underlying tension: growth is accelerating faster than the systems needed to support it.
Building more is not the same as building better
Much of the current conversation around AI infrastructure is still framed in terms of scale: how much capacity can be added, how quickly, and at what cost. But scaling AI is not simply a matter of building more data centers. It requires building differently.
That shift starts with recognizing that data centers are no longer isolated assets. They are part of a broader energy ecosystem, where compute demand, power availability and cooling efficiency are tightly linked. Treating them as independent units — optimized for performance alone — risks creating inefficiencies that ripple beyond the facility itself.
It also requires a level of coordination that has not traditionally existed. Energy providers, infrastructure developers and technology operators have often worked in parallel, each optimizing for their own objectives. AI infrastructure collapses those boundaries. Its performance depends on how well these systems are integrated.
And then there is the question of data discipline. For the past two years, much of the focus has been on scaling compute. But as AI systems become more embedded in operations, the limiting factor is increasingly data — its quality, governance and accessibility.
Without stronger data foundations, more compute does not necessarily translate into better outcomes. It simply amplifies inefficiency.
The cost of ignoring the constraints
If these issues remain secondary, they are unlikely to stay contained.
Grid instability, rising operational costs and environmental pressure tend to surface in ways that are difficult to manage after the fact. In some markets, this has already translated into project delays, regulatory pushback and growing public resistance.
Southeast Asia has not yet reached that point. But it is close enough to see how it unfolds elsewhere.
The region still has an advantage: it can learn from those experiences rather than replicate them. It can recognize that AI infrastructure is not just an economic lever, but a system that interacts with national resources in complex ways. That recognition, however, requires a shift in priorities.
A different measure of success
The next phase of AI growth is unlikely to be defined by who builds the most capacity. It will be defined by who can sustain it.
For Southeast Asia, that means treating energy, data and infrastructure as interdependent, not separate domains. It means accepting that some constraints cannot be engineered away, only managed more intelligently.
And it means acknowledging that long-term competitiveness will depend as much on efficiency and resilience as it does on scale.
The region’s AI ambitions are justified. The opportunity is real. But ambition without constraint tends to create fragility.
The more useful question, then, is not how quickly Southeast Asia can catch up in the AI race. It is whether it can avoid the limits that others are only now beginning to confront.
Because in the end, the challenge is not building the future. It is building one that holds.
This article was produced as part of TechRadar Pro Perspectives, our channel to feature the best and brightest minds in the technology industry today.
The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/pro/perspectives-how-to-submit