AI Power Crunch: Indian Startup C2i Semiconductors Secures $15M to Tackle Data Center Limits
The relentless growth of artificial intelligence (AI) is placing unprecedented demands on data center infrastructure. While much of the focus has been on compute – the GPUs and processors driving AI workloads – a new bottleneck is emerging: power. Scaling AI data centers is increasingly limited not by the ability to process information, but by the ability to reliably and efficiently deliver the massive amounts of electricity required. That shift is drawing investor attention: Peak XV Partners has led a $15 million Series A round for C2i Semiconductors, an Indian startup building plug-and-play, system-level power solutions designed to sharply reduce energy losses and improve the economics of large-scale AI infrastructure. The round brings C2i’s total funding to $19 million.
The Growing Strain on Data Center Power
Data center energy demand is accelerating at an astonishing rate globally. A December 2025 report from BloombergNEF projects electricity consumption from data centers to nearly triple by 2035. Goldman Sachs Research estimates an even more dramatic surge, forecasting a 175% increase in data center power demand by 2030 compared to 2023 levels – equivalent to adding another top-10 power-consuming country to the global grid. This isn’t simply about generating more electricity; it’s about efficiently converting and delivering it within the data center itself.
The Inefficiency of Power Conversion
A significant portion of energy is lost during the conversion process inside data centers. High-voltage power must be stepped down through multiple conversion stages before it can be used by GPUs, and each stage dissipates some energy as heat. Today this chain wastes approximately 15% to 20% of the energy delivered, a substantial loss that directly impacts operational costs and sustainability efforts. “What used to be 400 volts has already moved to 800 volts, and will likely go higher,” explains Preetam Tadeparthy, co-founder and CTO of C2i, highlighting the rising voltage demands of modern AI hardware.
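To see how a chain of conversion stages produces losses of this magnitude, here is a minimal back-of-the-envelope sketch. The stage count and per-stage efficiency below are illustrative assumptions, not figures from C2i or the article; the point is simply that several individually efficient stages compound into a double-digit loss:

```python
# Illustrative sketch: energy lost when power passes through a chain of
# conversion stages, each with its own efficiency. Stage efficiencies here
# are hypothetical, chosen only to show how losses compound.

def end_to_end_loss(stage_efficiencies):
    """Fraction of input energy lost across a chain of conversion stages."""
    delivered = 1.0
    for eff in stage_efficiencies:
        delivered *= eff  # each stage passes on only part of its input
    return 1.0 - delivered

# Hypothetical chain: six stages, each 97% efficient.
stages = [0.97] * 6
loss = end_to_end_loss(stages)
print(f"End-to-end loss: {loss:.1%}")  # ~16.7%, inside the 15-20% range cited
```

Even small per-stage inefficiencies multiply quickly, which is why end-to-end redesigns can find savings that component-level tweaks miss.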
C2i Semiconductors: A Grid-to-GPU Solution
Founded in 2024 by a team of seasoned power executives from Texas Instruments – Ram Anant, Vikram Gakhar, Preetam Tadeparthy, Dattatreya Suryanarayana, Harsha S. B, and Muthusubramanian N. V – C2i is taking a fundamentally different approach to power delivery. They are redesigning the entire system as a single, plug-and-play “grid-to-GPU” solution, encompassing the data center bus all the way to the processor itself. The company, whose name stands for Control, Conversion, and Intelligence, aims to eliminate inefficiencies inherent in traditional, fragmented power architectures.
By integrating power conversion, control, and packaging into a unified platform, C2i estimates it can cut end-to-end losses by around 10 percentage points – translating to roughly 100 kilowatts saved for every megawatt consumed. This reduction has cascading benefits, including lower cooling costs, improved GPU utilization, and enhanced overall data center economics.
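The savings figure above is straightforward arithmetic, sketched here for clarity. The function name and inputs are illustrative, not from C2i:

```python
# Back-of-the-envelope check of the article's figure: a 10-percentage-point
# reduction in end-to-end losses on a 1 MW feed.

def power_saved_kw(facility_kw, loss_reduction):
    """Power saved (kW) given a facility draw and a fractional loss reduction."""
    return facility_kw * loss_reduction

saved = power_saved_kw(1_000, 0.10)  # 1 MW = 1,000 kW; 10-point loss reduction
print(f"{saved:.0f} kW saved per MW")  # 100 kW, matching the article's figure
```

At hyperscale, that 100 kW per megawatt also avoids the cooling load needed to reject the same energy as heat, which is where the cascading benefits come from.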
The Economic Impact of Power Efficiency
“All that translates directly to total cost of ownership, revenue, and profitability,” emphasizes Tadeparthy. For venture capital firms like Peak XV Partners (formerly Sequoia Capital), the economic implications are substantial. Rajan Anandan, a Managing Director at Peak XV Partners, points out that after the initial investment in servers and facilities, energy costs become the dominant ongoing expense for data centers. Even incremental efficiency gains can yield significant returns.
“If you can reduce energy costs by, call it, 10 to 30%, that’s like a huge number,” Anandan told GearTech. “You’re talking about tens of billions of dollars.” This highlights the immense market opportunity for companies that can deliver meaningful improvements in power efficiency.
Validation and Deployment: The Road Ahead
C2i is rapidly moving towards validation and deployment. The startup expects its first two silicon designs to be returned from fabrication between April and June. Following this, they plan to rigorously validate performance with data center operators and hyperscalers who have already expressed interest in reviewing the data. This early engagement with potential customers is crucial for refining the technology and ensuring it meets real-world needs.
The Bengaluru-based startup has assembled a team of approximately 65 engineers and is establishing customer-facing operations in the U.S. and Taiwan to facilitate early deployments and support its growing customer base.
Challenges and Opportunities in a Mature Market
Power delivery is a well-established component of the data center stack, dominated by large, well-funded incumbents with long qualification cycles. Many newer companies focus on optimizing individual components, but C2i’s end-to-end redesign requires simultaneous coordination of silicon, packaging, and system architecture – a capital-intensive and complex undertaking. Few startups attempt this holistic approach, and proving its viability in production environments will take time and effort.
Anandan acknowledges the execution challenges, noting that all startups face technology, market, and team risks. However, he believes C2i’s feedback loop will be relatively short. “We’ll know in the next six months,” he stated, emphasizing the importance of upcoming silicon validation and early customer feedback in proving the company’s thesis.
India's Rising Semiconductor Ecosystem
The investment in C2i also reflects the growing maturity of India’s semiconductor design ecosystem. Anandan draws a parallel to the early days of e-commerce in India, stating, “The way you should look at semiconductors in India is, this is like 2008 e-commerce. It’s just getting started.”
He points to the increasing depth of engineering talent – with a growing proportion of global chip designers based in India – and the government-backed design-linked incentives that have lowered the cost and risk of tape-outs. These factors are making it increasingly feasible for startups to build globally competitive semiconductor products from India, rather than solely operating as captive design centers.
The Future of AI Infrastructure
Whether these favorable conditions will translate into a globally competitive product remains to be seen. The coming months will be critical as C2i validates its system-level power solutions with customers. If it succeeds, the company could not only help ease the AI power crunch but also solidify India’s position as a serious player in the global semiconductor landscape. With data center power demand set to keep climbing as AI workloads grow, efficient power delivery will only become more valuable – and C2i’s end-to-end approach, experienced team, and fresh capital position it to compete for that opportunity.
- Key Takeaway: Power efficiency is becoming the critical limiting factor in scaling AI data centers.
- C2i's Solution: A plug-and-play "grid-to-GPU" power delivery system designed to reduce energy losses.
- Potential Impact: Significant cost savings, improved GPU utilization, and a more sustainable AI ecosystem.