Korean startup backed by Samsung and Arm launches rack-sized inference monsters, claims "6x lower power consumption" and up to 75% cheaper acquisition cost compared to Nvidia


  • Rebellions launches modular AI systems designed for scalable data center deployment
  • RebelRack operates as a single production-ready rack for AI workloads
  • RebelPOD scales infrastructure into clustered deployments for larger enterprise workloads

Rebellions has introduced two rack-scale inference systems, RebelRack and RebelPOD, extending its platform beyond chip design into fully deployable infrastructure.

These systems are designed to run artificial intelligence workloads directly within data center environments, combining hardware and software into integrated units.

RebelRack operates as a single production-ready rack, while RebelPOD scales this model into clustered deployments intended for larger workloads.

Performance and cost claims draw scrutiny

The company claims these systems deliver “6x lower power consumption” and up to 75% lower acquisition costs compared with Nvidia-based systems.

These claims focus on efficiency at the system level, where power usage and total cost of ownership have become central concerns for operators.

While such figures suggest meaningful reductions, they depend on workload conditions and deployment environments, which may vary across use cases.

The infrastructure is built around the Rebel100 neural processing unit and supported by a cloud-native software stack designed for production environments.

The platform integrates with widely used frameworks, including PyTorch and Kubernetes-based systems, allowing deployment across different models and infrastructure setups.

The launch reflects a broader shift in the artificial intelligence sector, where the ability to run models efficiently is becoming as important as developing them.

“AI is now measured by its ability to operate in the real world — at scale, under power constraints, and with clear economic return,” says Sunghyun Park, Co-Founder and CEO of Rebellions.

Data center operators are increasingly constrained by power availability and infrastructure limits, creating demand for systems that can deliver performance within those boundaries.

Rebellions is accelerating its international presence, with a focus on the United States, where demand for deployable AI infrastructure is rapidly growing.

The company aims to provide ready-to-use systems that integrate seamlessly with existing operations, allowing organizations to deploy AI workloads without long setup periods.

The firm emphasizes end-to-end support, combining hardware, validated software, and ongoing operational assistance to ensure production reliability.

This approach aims to reduce integration challenges often encountered in data centers running diverse AI workloads.

A recent $400 million pre-IPO funding round, led by Mirae Asset Financial Group and the Korea National Growth Fund, will be used to expand manufacturing capacity and strengthen supply chains.

The round values Rebellions at approximately $2.34 billion, reflecting investor confidence in its strategy and market potential.


Efosa Udinmwen
Freelance Journalist

Efosa has been writing about technology for over seven years, initially driven by curiosity but now fueled by a strong passion for the field. He holds a master's degree and a PhD in the sciences, which gave him a solid foundation in analytical thinking.
