DealForge autonomously sources, scores, and writes investment memos on venture deals. Stop manually hunting.

1,180+ deals tracked  ·  22 AI investment memos  ·  Updated daily

← Back to leaderboard

Show HN: LLMs consume 5.4x less mobile energy than ad-supported web search

44 AI Score
Show HN · Other · Added Apr 25, 2026

Details

Sector
Other
Total Funding
$0
Last Round
$0

About

The standard AI energy debate compares server-side LLM inference to a server-side Google query. I think this misses most of what actually happens on a mobile device during a real search session.

I built a parametric model of the full end-to-end mobile search session: 4G/5G radio energy, SoC rendering cost for a 2.5 MB page, programmatic advertising RTB auctions running in the background, and network transmission costs for both sides. Then I compared it to an equivalent LLM session.

Main finding across 10,000 Monte Carlo draws: on mobile, a standard LLM session uses on average 5.4x less energy than a classic ad-supported web search session. Programmatic advertising alone accounts for up to 41% of device battery drain per session.

Caveats I tried to be explicit about:

- The advantage disappears on fixed Wi-Fi/fiber
- It reverses for reasoning models
- This is a parametric model, not an empirical device measurement. Greenspector has offered to run terminal measurements for v2
- Jevons paradox applies

SSRN working paper, not peer-reviewed. Methodology and Monte Carlo distributions are fully documented in the paper. Happy to defend the assumptions.

DOI: 10.2139/ssrn.6287918
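The structure of the model described above can be sketched as a Monte Carlo comparison of per-session energy draws. The parameter ranges below are illustrative placeholders, not the paper's actual distributions; only the overall shape (component-wise draws summed per session, averaged over many draws) follows the post's description.

```python
import random

random.seed(0)

# Illustrative energy ranges in joules per session -- assumed values,
# NOT taken from the paper. They exist only to show the model structure.
def draw_web_search_session():
    radio = random.uniform(5.0, 15.0)     # 4G/5G radio energy for a ~2.5 MB page
    render = random.uniform(2.0, 6.0)     # SoC rendering cost
    ads = random.uniform(3.0, 12.0)       # background programmatic RTB auctions
    network = random.uniform(1.0, 4.0)    # transmission cost, both sides
    return radio + render + ads + network

def draw_llm_session():
    radio = random.uniform(0.5, 2.0)      # small text payload over the radio
    render = random.uniform(0.3, 1.0)     # lightweight text rendering
    inference = random.uniform(0.5, 3.0)  # server-side inference share
    return radio + render + inference

N = 10_000  # number of Monte Carlo draws, as in the paper
web = [draw_web_search_session() for _ in range(N)]
llm = [draw_llm_session() for _ in range(N)]

ratio = (sum(web) / N) / (sum(llm) / N)
print(f"mean web/LLM energy ratio: {ratio:.1f}x")
```

With these made-up ranges the ratio lands in the same ballpark as the paper's headline figure, but the actual 5.4x result depends entirely on the documented distributions in the SSRN paper.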

AI Score Reasoning

This is currently a research-driven thesis rather than a commercial startup, offering a compelling counter-narrative on AI energy consumption that could reshape how mobile search strategies are evaluated. While the technical insight is high-quality and timely, the absence of a formal product, business model, or team makes it a highly speculative 'pre-seed' research project.

Source

Show HN — View original →