As AI data centers hit power limits, Peak XV backs Indian startup C2i to fix the bottleneck

Power Play: India’s C2i Semiconductors Aims to Revolutionize AI Data Centers with 10% Energy Savings

In a bold move that could reshape the economics of artificial intelligence infrastructure, Indian semiconductor startup C2i Semiconductors has secured $15 million in Series A funding to tackle one of tech’s most pressing challenges: power efficiency in AI data centers.

The Bengaluru-based company, whose name stands for “Control, Conversion, and Intelligence,” is developing plug-and-play, system-level power solutions designed to slash energy losses that currently plague large-scale AI infrastructure. The funding round was led by Peak XV Partners (formerly Sequoia Capital India), with participation from Yali Deeptech and TDK Ventures, bringing C2i’s total funding to $19 million since its founding in 2024.

The Power Problem Behind AI’s Growth

As artificial intelligence models grow increasingly complex and data centers expand to meet demand, a fundamental shift is occurring in what limits AI infrastructure scalability. While compute power once dominated concerns, power delivery has emerged as the critical bottleneck.

“What used to be 400 volts has already moved to 800 volts, and will likely go higher,” explained Preetam Tadeparthy, C2i’s co-founder and CTO, in an exclusive interview with TechCrunch. This voltage escalation reflects the intensifying pressure on power systems as AI workloads demand more energy than ever before.

The numbers paint a stark picture. According to a December 2025 BloombergNEF report, data center electricity consumption is projected to nearly triple by 2035. Goldman Sachs Research estimates an even more dramatic surge—data center power demand could increase by 175% by 2030 compared to 2023 levels, equivalent to adding another top-10 power-consuming country to the global grid.

The Hidden Energy Drain

The inefficiency problem lies in the complex process of converting high-voltage power to levels usable by GPUs and other processors. Currently, this conversion process wastes approximately 15% to 20% of energy through multiple transformation stages, heat dissipation, and conversion losses.
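As a rough illustration of why multi-stage conversion wastes so much, per-stage losses compound multiplicatively: even a chain of individually decent stages lands in the 15–20% loss range the article describes. The stage names and efficiency figures below are illustrative assumptions, not C2i's numbers.

```python
# Sketch: end-to-end loss in a multi-stage power conversion chain.
# Per-stage efficiencies are illustrative assumptions, not C2i figures.
from math import prod

# Hypothetical conversion chain from the facility bus down to the GPU rail
stage_efficiencies = {
    "AC-DC rectifier": 0.96,
    "DC-DC intermediate bus converter": 0.96,
    "point-of-load voltage regulator": 0.90,
}

# Efficiencies multiply, so losses compound across stages
end_to_end = prod(stage_efficiencies.values())
loss_pct = (1 - end_to_end) * 100

print(f"End-to-end efficiency: {end_to_end:.1%}")
print(f"Energy lost in conversion: {loss_pct:.1f}%")
```

With these assumed numbers the chain delivers only about 83% of input power to the processor, which is why collapsing or integrating stages, rather than tuning each one in isolation, is where the headroom lies.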

C2i’s founders—Ram Anant, Vikram Gakhar, Preetam Tadeparthy, Dattatreya Suryanarayana, Harsha S. B., and Muthusubramanian N. V.—bring deep expertise from Texas Instruments, where they specialized in power management solutions. Their insight: by treating power delivery as an integrated system rather than a series of discrete components, significant efficiency gains become possible.

A System-Level Solution

C2i’s approach is revolutionary in its simplicity and ambition. Rather than optimizing individual components, the company is redesigning power delivery as a single, plug-and-play “grid-to-GPU” system that spans from the data center’s main power bus all the way to the processor itself.

“By treating power conversion, control and packaging as an integrated platform, C2i estimates it can cut end-to-end losses by around 10%—roughly 100 kilowatts saved for every megawatt consumed,” Tadeparthy explained. This translates directly to reduced cooling costs, improved GPU utilization, and better overall data center economics.

The impact on total cost of ownership is substantial. “All that translates directly to total cost of ownership, revenue, and profitability,” Tadeparthy emphasized. In an industry where energy costs become the dominant ongoing expense after initial capital investment, even incremental efficiency gains carry enormous value.
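The savings arithmetic behind "100 kilowatts per megawatt" can be sketched in a few lines. The facility size and electricity price below are illustrative assumptions (not figures from C2i or Peak XV), and the result excludes the secondary savings from reduced cooling load.

```python
# Sketch of the savings arithmetic behind "100 kW saved per MW consumed."
# Facility size and electricity price are illustrative assumptions.

facility_load_mw = 10       # hypothetical AI data center draw, in megawatts
savings_fraction = 0.10     # C2i's estimated cut in end-to-end losses
price_per_kwh = 0.08        # assumed industrial electricity rate, USD
hours_per_year = 8760

saved_kw = facility_load_mw * 1000 * savings_fraction  # 100 kW per MW
annual_kwh = saved_kw * hours_per_year
annual_usd = annual_kwh * price_per_kwh

print(f"Power saved: {saved_kw:,.0f} kW")
print(f"Annual energy savings: ${annual_usd:,.0f}")
```

At these assumed rates, a single 10 MW facility would save on the order of $700,000 per year on energy alone, which is how modest percentage gains compound into the "tens of billions" figure across the global fleet.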

Why Peak XV Partners Bet Big

For Peak XV Partners, the investment represents a strategic bet on how power costs will shape AI infrastructure economics at scale. Rajan Anandan, the venture firm’s managing director, sees energy efficiency as the next frontier for competitive advantage in the AI race.

“If you can reduce energy costs by, call it, 10 to 30%, that’s like a huge number,” Anandan told TechCrunch. “You’re talking about tens of billions of dollars.” This perspective reflects the massive scale of the opportunity—as AI infrastructure expands globally, even small percentage improvements in efficiency compound into enormous financial and environmental benefits.

The timing of the investment is particularly significant. As data centers face increasing scrutiny over their environmental impact and energy consumption, solutions that can deliver both economic and sustainability benefits are becoming essential rather than optional.

From Silicon to Scale

C2i’s journey from concept to commercialization is moving at impressive speed. The company expects its first two silicon designs to return from fabrication between April and June 2026, after which it plans to validate performance with data center operators and hyperscalers who have already expressed interest in reviewing the technology.

With a team of approximately 65 engineers based in Bengaluru, C2i is establishing customer-facing operations in the United States and Taiwan to prepare for early deployments. This geographic expansion reflects the global nature of the AI infrastructure market and the need to be close to major data center operators.

The Challenge of Disrupting an Entrenched Industry

Power delivery represents one of the most established segments of the data center technology stack, long dominated by large incumbents with deep pockets and lengthy qualification cycles. C2i’s approach of redesigning power delivery end-to-end requires coordinating silicon design, packaging innovation, and system architecture simultaneously—a capital-intensive strategy that few startups attempt.

The complexity is compounded by the need for extensive validation. Data center operators are notoriously conservative about adopting new technologies, particularly those affecting power delivery where reliability is paramount. C2i must prove not only that its solutions work in theory but that they can deliver consistent performance under the demanding conditions of production data centers.

India’s Semiconductor Moment

C2i’s emergence also signals a maturing of India’s semiconductor design ecosystem. Anandan draws a compelling parallel to another technology revolution: “The way you should look at semiconductors in India is, this is like 2008 e-commerce. It’s just getting started.”

This comparison reflects several converging factors. First, India now boasts a deep pool of engineering talent, with a growing percentage of global chip designers based in the country. Second, government-backed design-linked incentives have lowered the cost and risk of tape-outs, making it increasingly viable for startups to build globally competitive semiconductor products domestically rather than operating solely as captive design centers.

The success of companies like C2i could catalyze further investment in India’s semiconductor ecosystem, potentially positioning the country as a major player in the global chip design industry. This would represent a significant shift from India’s traditional role as a services and support hub to a center of semiconductor innovation and product development.

The Six-Month Test

Despite the promising technology and strong backing, significant challenges remain. Power delivery is a capital-intensive business requiring substantial investment in R&D, manufacturing partnerships, and customer acquisition. The qualification cycles for data center components can stretch for years, and competition from established players with deeper pockets and broader customer relationships is intense.

Anandan acknowledges these risks but sees a relatively short feedback loop for C2i’s approach. “We’ll know in the next six months,” he said, pointing to the upcoming silicon validation and early customer feedback as the moment when the investment thesis will be truly tested.

The stakes are high not just for C2i but for the broader AI infrastructure ecosystem. If successful, C2i’s technology could help address one of the most significant barriers to sustainable AI growth—the spiraling energy demands of increasingly powerful models and larger data centers.

The Broader Implications

Beyond the immediate business opportunity, C2i’s work touches on fundamental questions about the sustainability of AI development. As models grow larger and training runs become more energy-intensive, the industry faces a critical choice: continue scaling power consumption or find ways to do more with less.

C2i represents the latter approach—using clever engineering and system-level thinking to extract more value from every watt of power. If successful, this could help AI development continue its rapid pace while mitigating some of the environmental concerns that have begun to shadow the industry.

The company’s success could also influence how future data centers are designed, potentially shifting the focus from raw compute density to more holistic considerations of power efficiency and total system performance. This could have ripple effects throughout the tech industry, influencing everything from chip design to facility architecture to the economics of cloud computing.

Looking Ahead

As C2i prepares to ship its first silicon and engage with potential customers, the tech industry will be watching closely. The company’s technology addresses a problem that affects every AI deployment, from cutting-edge research labs to commercial cloud services, making its potential impact both broad and deep.

The next six months will be crucial. If C2i can demonstrate the efficiency gains it claims and convince data center operators to adopt its technology, it could establish itself as a key player in the AI infrastructure stack. Failure would be equally instructive, potentially highlighting the challenges of disrupting entrenched industries even with superior technology.

Either way, C2i’s journey reflects the evolving nature of the AI industry itself—an ecosystem where power, not just compute, has become the limiting factor in scaling the technology that many believe will define the coming decades.


Tags: AI infrastructure, data center efficiency, power management, semiconductor startup, energy savings, C2i Semiconductors, Peak XV Partners, Texas Instruments alumni, Bengaluru tech, grid-to-GPU solutions, silicon validation, hyperscaler partnerships, total cost of ownership, sustainability in AI, semiconductor design India


