Niv-AI exits stealth to wring more power performance out of GPUs
AI Data Centers Waste Billions in Unused Power—This Startup Aims to Fix It
In the high-stakes world of artificial intelligence, electricity isn’t just a utility—it’s a precious raw material. Yet, as AI models grow increasingly complex and data centers expand to meet demand, a staggering amount of this vital resource is being squandered. Industry leaders are sounding the alarm, and now a new Israeli startup is stepping up with a bold solution.
During Nvidia’s annual GTC conference, CEO Jensen Huang delivered a stark message: “There is so much power squandered in these AI factories.” The company’s rallying cry was equally blunt: “Every unused watt is revenue lost.” With AI infrastructure costing billions and energy prices soaring, the pressure to optimize power usage has never been greater.
Enter Niv-AI, a Tel Aviv-based startup that has just emerged from stealth mode with $12 million in seed funding. Founded last year by CEO Tomer Timor and CTO Edward Kizis, Niv-AI is on a mission to change how data centers manage their energy consumption. The company's backers include Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. Niv-AI declined to disclose its valuation.
The problem Niv-AI is tackling is both complex and costly. As frontier AI labs operate thousands of GPUs in concert to train and run advanced models, they face frequent, millisecond-scale power demand surges. These surges occur as processors switch between computation tasks and communicate with other GPUs. For data center operators, this creates a nightmare scenario: they must either invest in expensive temporary energy storage to cover these surges or throttle GPU usage, both of which reduce the return on their massive investments in cutting-edge chips.
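The dynamic described above can be sketched in a toy simulation. The sketch below is purely illustrative and not Niv-AI's model: every number (GPU count, wattages, the compute/communication duty cycle) is an assumption chosen to show why synchronized millisecond-scale swings force operators to provision for the peak rather than the average.

```python
# Illustrative sketch (assumed numbers, not Niv-AI's data): GPUs alternate
# between high-draw compute ticks and low-draw communication ticks. Because
# synchronized training makes every GPU surge at the same millisecond, the
# per-GPU swings add up instead of averaging out, and the operator must
# provision for the cluster's peak draw, not its average.

def gpu_power_trace(steps, compute_w=700.0, comm_w=250.0, period=4):
    """Per-millisecond draw for one GPU: one comm tick per `period` ticks."""
    return [comm_w if t % period == 0 else compute_w for t in range(steps)]

def cluster_stats(num_gpus, steps):
    trace = gpu_power_trace(steps)
    cluster = [w * num_gpus for w in trace]  # all GPUs surge in lockstep
    peak, avg = max(cluster), sum(cluster) / len(cluster)
    return peak, avg, peak / avg

peak, avg, ratio = cluster_stats(num_gpus=1000, steps=1000)
print(f"peak {peak/1e6:.2f} MW, average {avg/1e6:.3f} MW, peak/avg {ratio:.2f}")
```

The gap between peak and average is exactly the capacity an operator must pay for (in batteries or grid headroom) but rarely uses, which is the waste Huang's "every unused watt" remark points at.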
Lior Handelsman, a partner at Grove Ventures and board member at Niv-AI, puts it bluntly: “We just can’t continue building data centers the way we build them now.” The traditional approach is no longer sustainable, either economically or environmentally.
Niv-AI’s solution begins with understanding the problem at a granular level. The company is deploying rack-level sensors that capture GPU power usage at millisecond resolution, both on hardware it owns and at design partners’ facilities. The goal is to map the power profiles of specific deep learning workloads and develop mitigation techniques that let data centers unlock more of their existing capacity.
But Niv-AI isn’t stopping there. The team plans to train an AI model on the data it collects, teaching it to predict and synchronize power loads across the data center. The company envisions this as a “copilot” for data center engineers, providing real-time insights and recommendations to optimize energy usage.
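One way to picture what "synchronizing power loads" could buy: if groups of GPUs that currently surge in lockstep were instead given staggered phase offsets, their surges would interleave and the aggregate peak would drop while average power stays unchanged. The sketch below is an assumption about the general technique, not Niv-AI's actual method, and the wattages and duty cycle are invented for illustration.

```python
# Illustrative load-shaping sketch (assumed technique and numbers, not
# Niv-AI's method): four GPU groups each alternate compute and communication
# ticks. With identical phase offsets their surges stack; with staggered
# offsets the surges interleave and the cluster's peak draw falls.

def group_trace(steps, offset, compute_w=700.0, comm_w=250.0, period=4):
    """One group's per-ms draw, phase-shifted by `offset` ticks."""
    return [comm_w if (t + offset) % period == 0 else compute_w
            for t in range(steps)]

def aggregate_peak(offsets, steps=1000):
    groups = [group_trace(steps, off) for off in offsets]
    cluster = [sum(ws) for ws in zip(*groups)]  # total draw per millisecond
    return max(cluster)

synced = aggregate_peak([0, 0, 0, 0])     # all groups surge together
staggered = aggregate_peak([0, 1, 2, 3])  # surges interleaved across groups
print(f"synced peak {synced:.0f} W, staggered peak {staggered:.0f} W")
```

With the assumed numbers, staggering flattens the aggregate trace entirely: at every millisecond exactly one group is in its low-power phase, so the cluster draws a constant load the grid can plan around.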
The startup’s ambitions extend beyond individual data centers. Timor explains, “The grid is actually afraid of the data center consuming too much power at a specific time.” Niv-AI sees its ultimate product as a missing “intelligence layer” between data centers and the electrical grid. By creating more responsible power profiles, they aim to help data centers utilize more GPUs and make better use of the power they’re already paying for, while also easing the burden on the grid.
The timing for Niv-AI’s solution couldn’t be better. As hyperscalers race to build new data centers, they’re facing significant power, land-use, and supply chain constraints. Being able to squeeze more performance out of existing infrastructure is not just attractive; it’s essential.
Niv-AI expects to have an operational system in several U.S. data centers within the next six to eight months. If successful, their technology could mark a turning point in the AI industry, transforming data centers from power-hungry behemoths into models of efficiency.
As the AI revolution accelerates, the battle for energy efficiency is becoming just as important as the race for algorithmic supremacy. With Niv-AI’s innovative approach, the industry may finally have a way to turn wasted watts into valuable compute power, ensuring that the future of AI is not just smart, but also sustainable.
Tags: AI, Data Centers, Energy Efficiency, Power Management, Nvidia, GPU, Machine Learning, Sustainability, Tech Startups, Electrical Grid, Deep Learning, Hyperscalers, Innovation