Let me tell you how the AI bubble ends. Not with a whimper, but with hundreds of billions of dollars in planned data centre capacity going up in smoke. Loads of companies going bust. And quite possibly, a proper economic wallop that takes a good chunk of the wider economy down with it.
Sounds dramatic? Maybe. But hear me out.
The Problem With Brute Force
All the big AI tools you’ve probably heard of — ChatGPT, Claude, Gemini — are what’s known as Large Language Models, or LLMs. Despite the fancy name, what they actually do is remarkably simple in concept: they predict what word (or chunk of text, called a token) comes next. They learn to do this by being trained on mind-boggling amounts of data, and the results are genuinely impressive. I use these tools daily, and I’ll be the first to admit they’re brilliant. I get more done, I can explore ideas faster, and it’s honestly quite fun.
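To make “predict what comes next” concrete, here’s a toy next-word predictor in Python. It simply counts which word follows which in a tiny corpus and picks the most common follower. Real LLMs learn vastly richer statistics over tokens, but the core job is the same. (The corpus and function names are mine, purely for illustration.)

```python
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat "
    "the dog sat on the rug "
    "the cat chased the dog"
).split()

# Count how often each word follows each other word (a bigram model).
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most frequent word seen after `word`, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("sat"))  # "on" — the only word ever seen after "sat"
```

That’s the whole trick, scaled up by a factor of trillions: more data, more parameters, and tokens instead of whole words.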
But here’s the catch. There’s a mathematical ceiling to this “brute force” approach. You can only throw so much data and computing power at a prediction engine before you hit diminishing returns. And we’re getting close to that threshold — possibly within a year.
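One rough way to picture those diminishing returns: empirical “scaling laws” suggest a model’s loss falls roughly as a power law in compute, so each extra tenfold increase in compute buys a smaller absolute improvement than the last. The exponent and numbers below are made up for shape only, not real measurements.

```python
# Illustrative only: if loss falls as a power law in compute
# (loss ~ compute ** -alpha), each 10x of compute helps less.
alpha = 0.05  # made-up exponent, chosen purely to show the curve's shape

losses = [c ** -alpha for c in (1e21, 1e22, 1e23, 1e24)]
gains = [a - b for a, b in zip(losses, losses[1:])]

print(losses)  # steadily decreasing
print(gains)   # each 10x of compute buys a smaller drop than the last
```

The curve never quite flattens, but the price of each step along it keeps climbing — which is exactly the squeeze the big labs are in.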
The Environmental Elephant in the Room
When a single data centre consumes as much electricity as an entire town, you know something’s off. AI isn’t going to save us from climate change — it’s actively making it worse. These massive facilities also guzzle enormous quantities of water for cooling. Communities near proposed data centre sites are increasingly pushing back, and rightly so. Our planet’s resources aren’t infinite, no matter how clever the software running on those servers might be.
The International Energy Agency has flagged the rapid growth of data centre energy consumption as a serious concern, and it’s only accelerating.
So, Is AI the Devil’s Work?
No. And I think turning your back on it entirely would be a mistake. LLMs do have genuine strengths — they’re good at analysing data, helping with code, drafting content, and loads of other practical tasks. I’m using AI right now to help build websites, and it’s a bit like having a keen junior developer on hand. You tell it what to do, review the output, and iterate. It works really well.
The key is being mindful. You can’t fight progress, so you might as well join in and try to do something good with it. That might not sit right with everyone, but for me, it feels like the right balance.
The Future Is Local
Here’s what I think happens next, once the dust settles from the inevitable crash. The winners won’t be the companies building ever-larger models that need their own power stations to run. The winners will be local models — smaller, focused AI tools that run right on your own computer.
Modern computers are increasingly shipping with something called an NPU, or Neural Processing Unit. It sounds like something out of a sci-fi film, but it’s just a type of chip that’s been designed specifically for AI tasks, rather than general computing or graphics. Think of it as your computer having a specialist brain for AI work, alongside its regular one.
These smaller, local models won’t try to do everything. Instead, they’ll be tuned for specific jobs and do them well — all without sending your data to some distant server farm.
What You Can Do Right Now
If you want to get ahead of the curve before the bubble bursts, here’s my advice. Get yourself a computer that can run local LLMs (most modern machines with an NPU will do). Start experimenting with building your own AI “agents” — essentially, small automated helpers that you teach to do useful things for your specific needs.
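The shape of such an agent can be surprisingly simple. Here’s a minimal sketch: a stand-in for a local model proposes an action, only actions on an explicit allow-list can run, and a human approves each one before it executes. There’s no real LLM here — `propose()` is a placeholder for a local model’s output, and the action names are invented for illustration.

```python
# Toy agent loop: propose -> check allow-list -> human approval -> act.
ALLOWED_ACTIONS = {
    "summarise_notes": lambda: "3 bullet points of today's notes",
    "list_reminders": lambda: ["water plants", "back up laptop"],
}

def propose(request: str) -> str:
    # Stand-in for a local model: map a request to an action name.
    return "summarise_notes" if "notes" in request else "list_reminders"

def run_agent(request: str, approve):
    action = propose(request)
    if action not in ALLOWED_ACTIONS:
        raise ValueError(f"refusing unknown action: {action}")
    if not approve(action):          # the human stays in the loop
        return None
    return ALLOWED_ACTIONS[action]()

print(run_agent("tidy my notes", approve=lambda a: True))
```

The allow-list and approval step are the point: the agent can only ever do things you’ve explicitly permitted.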
But — and this is crucial — be very careful what you connect them to. The technology is still in its early days. You absolutely do not want an AI agent anywhere near your bank accounts, online marketplaces, or anything else where a mistake could cost you real money. It will backfire. Spectacularly.
Real-World Uses That Actually Matter
I’ve got a personal knowledge base in my notes app that I’ve built up over years. By my own admission, it’s a bit of a mess. A local AI agent could help me organise and search through it — all without any of my data leaving my machine.
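Even without a model, the on-device idea is easy to sketch: score each note against a query and return the best matches, entirely locally. This toy version uses plain keyword overlap; a real local model would use embeddings for fuzzier matching, but the privacy property is identical — nothing leaves the machine. The note names and contents are invented.

```python
# Minimal local notes search: rank notes by keyword overlap with the query.
notes = {
    "solar.md": "quotes for rooftop solar panels and a home battery",
    "recipes.md": "slow cooker recipes for winter evenings",
    "energy.md": "notes on forming a local energy collective",
}

def search(query: str, top: int = 2):
    terms = set(query.lower().split())
    scored = [
        (len(terms & set(text.lower().split())), name)
        for name, text in notes.items()
    ]
    # Keep only notes with at least one matching term, best first.
    return [name for score, name in sorted(scored, reverse=True) if score][:top]

print(search("solar panels and battery storage"))
```

Swap the overlap score for embeddings from a small local model and you’ve got a private semantic search over your own mess of notes.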
Combine a local model with something like an offline copy of Wikipedia or other open data sources, and you can do genuinely useful things. Imagine finding out how many homes in your area have solar panels and battery storage, then reaching out to those neighbours about forming an energy collective. That’s using technology for community resilience — something that actually matters.
The Bottom Line
AI has real utility. But the current gold rush — the trillion-dollar bets on ever-bigger models and ever-thirstier data centres — is heading for a reckoning. When it comes, the smart money will be on small, local, open-source solutions that respect your privacy, your data, and your planet.
Just don’t buy into the hype. And whatever you do, be mindful of the climate impact every time you fire up a frontier model for something you could have done with a quick search instead.
KYAL <3

