Consumers brace for AI’s power bill: three in four fear data centres will hike household energy costs, four in five demand efficiency-first AI, and nearly all want sovereign, homegrown AI systems

BARCELONA, Spain–(BUSINESS WIRE)–#AIFYFN SambaNova, builders of the fastest chip for agentic AI, today released research highlighting the mounting concerns over the energy demands of AI data centres and the impact on households and national power grids. As AI deployment accelerates, business leaders and consumers are aware that legacy, GPU-based infrastructure is not built for the efficiency and scale required in a power‑constrained world.

The survey of 2,525 adults across the US and UK shows that concern about AI’s energy appetite is no longer abstract. Awareness of AI data centres’ electricity usage is widespread, and consumers are drawing a direct line between the infrastructure choices providers make and their own monthly bills.

Key findings from the AI Energy Survey

  • 75% of respondents fear AI data centres could lead to higher household energy bills in their area.
  • 75% say they are aware of the significant electricity consumption associated with AI data centres.
  • 83% believe AI companies should prioritize energy efficiency, even if it slows the rollout of new AI capabilities.
  • 71% agree AI data centres will strain their country’s power grid.
  • 91% say it is important that their country has its own AI systems.
  • Full findings can be found here.

The findings echo SambaNova’s 2024 AI Leadership Survey, which exposed a readiness gap inside enterprises as AI deployments surged. One year ago, 49.8% of business leaders were concerned about AI’s energy and efficiency challenges, yet only 13% monitored the power consumption of their AI systems. Today, concern has moved beyond the data centre floor and into living rooms: while leaders still struggle to measure AI power usage, three in four consumers worry AI infrastructure will raise their household bills and strain national grids.

AI data centres must be engines of efficient growth

“The findings reveal a new reality: AI is no longer just an enterprise technology story – it is an infrastructure story that reaches all the way to consumers’ electricity bills,” said Rodrigo Liang, CEO and co‑founder of SambaNova. “Data centres are the growth engine of AI, but if they are built on inefficient hardware, that growth will come with unacceptable power and cost trade‑offs.”

“People want powerful, always‑on AI – but they also want providers to keep grids stable and energy costs under control,” Liang continued. “This is why we’re focused on building efficient systems that dramatically increase tokens‑per‑second and throughput per rack, without blowing past standard power envelopes.”

Liang added: “With our new SN50-based systems, customers can stand up high‑density AI data centres that run fleets of intelligent agents in real time while staying within 20 kW per rack and using standard air cooling – no exotic power or cooling retrofits required. The SN50 chip delivers up to 5x more compute per accelerator and as much as three times better inference efficiency than leading GPU-based systems, enabling operators to scale AI services faster, serve larger models and longer context, and still reduce total cost of ownership. This is how we turn AI data centres into efficient, high‑growth infrastructure for the next decade, instead of a drag on national power systems.”

Underestimating AI’s power implications

Last year’s AI Leadership Survey showed enterprises were racing ahead with AI adoption while underestimating its power implications: “Last year, we projected that by 2027 more than 90% of leaders would be concerned about AI’s power demands and would monitor consumption as a board‑level KPI,” stated Liang. “This new data suggests the inflexion point may arrive faster than expected, with three‑quarters of consumers worried about AI’s impact on their bills and 83% explicitly calling for energy‑efficient AI.”

SN50 Chip: Built for power‑constrained, AI‑first data centres

SambaNova’s fifth‑generation SN50 RDU chip is purpose‑built for fast inference and agentic workloads in modern AI data centres. Each 20 kW SambaRack SN50 integrates 16 SN50 processors, and up to 16 racks can be interconnected to support 256 accelerators over a multi‑terabyte‑per‑second fabric, enabling customers to deploy very large models with longer context while maintaining high throughput and low latency.

For data centre operators facing tightening power budgets and surging AI demand, SN50 turns existing facilities into high‑density AI zones, allowing them to expand capacity quickly inside current power and cooling envelopes – exactly the kind of infrastructure shift consumers are now demanding.

About SambaNova

SambaNova is a leader in next‑generation AI infrastructure, providing a full stack platform that powers the fastest, most efficient AI inference for enterprises, NeoClouds, AI labs and service providers, and sovereign AI initiatives worldwide. Founded in 2017 and headquartered in San Jose, Calif., SambaNova delivers chips, systems and cloud services that enable customers to deploy state‑of‑the‑art models with superior performance, lower total cost of ownership and rapid time to value.

For more information, visit sambanova.ai or follow SambaNova on X and LinkedIn.

About the survey

This survey on AI’s energy requirements was commissioned by SambaNova and conducted in December 2025 with 2,525 consumers across the US and UK.

Contacts

Press Contact:
Virginia Jamieson

virginia.jamieson@sambanova.ai
650-279-8619
