DailyGlimpse

AI Chatbots Found to Use Far Less Mobile Energy Than Ad-Supported Web Searches

AI
April 28, 2026 · 11:16 AM

A recent analysis suggests that, on mobile devices, querying a large language model (LLM) consumes roughly 5.4 times less energy than running a traditional ad-supported web search. The finding challenges common assumptions about the environmental impact of AI, indicating that the energy cost of loading and displaying ads, trackers, and heavy web pages may outweigh the computational demands of generative AI responses.

The comparison focuses on energy used on the device itself, where a typical web search triggers multiple page loads, redirects, ads, and scripts that drain the battery. In contrast, an LLM processes a query and returns a concise answer in a single exchange, reducing both the data transferred and the on-device processing steps.
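The intuition behind that comparison can be sketched with a simple back-of-envelope model. Note that every figure below is an illustrative assumption for demonstration purposes, not a measurement from the analysis; the model simply combines radio energy per megabyte transferred with screen-on time.

```python
# Back-of-envelope mobile energy model. All numeric values are
# hypothetical placeholders, NOT figures from the cited analysis.

def mobile_energy_joules(data_mb, screen_seconds,
                         radio_j_per_mb=2.0, screen_watts=1.0):
    """Estimate on-device energy: radio transfer cost plus display time."""
    return data_mb * radio_j_per_mb + screen_seconds * screen_watts

# Ad-supported search: several page loads with ads, trackers, and scripts,
# plus time spent scanning results pages (assumed values).
search = mobile_energy_joules(data_mb=8.0, screen_seconds=60)

# LLM chat: one small request/response and a shorter reading session
# (assumed values).
chat = mobile_energy_joules(data_mb=0.1, screen_seconds=15)

print(f"search ~ {search:.1f} J, chat ~ {chat:.1f} J, "
      f"ratio ~ {search / chat:.1f}x")
```

Even this crude two-term model shows how page weight and session length, rather than raw compute, can dominate the on-device energy bill; with the assumed inputs it yields a ratio in the same ballpark as the reported 5.4x figure.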

While the study did not account for the energy used to train LLMs, a known high-cost phase, it underscores the potential efficiency gains at inference time. As AI models become more embedded in everyday tools, this energy differential could influence design choices for eco-conscious developers and users.