GPT-5, Gigawatts, and the Future of AI
This week, I caught an insightful chat between Sam Altman, CEO of OpenAI, and Cleo Abram.
The topic? GPT-5, the smarter, sleeker sibling of GPT-4o. Redditors are already complaining that GPT-5 feels “dry” (some were apparently attached to GPT-4o, lol), but in the world of AI models, change is the only constant, and these models are evolving fast!
The Three Pillars Holding AI Back
Altman says AI isn’t just about clever code—it’s about Data, Algorithmic Design, and Compute. Here’s the breakdown:
1. Data: Knowing Everything (But Not Really)
GPT-5 has ingested practically all the content you can find online. It’s a walking encyclopedia with a dash of personality. But don’t be fooled: AI still struggles with the “1,000-hour problems” (tasks that would take a human expert a thousand hours) even as it breezes through the one-minute stuff.
Creativity? That’s still our turf! AI can summarize, predict, and analyze, but it hasn’t yet learned how to imagine something that never existed.
As Sam put it, humans are still the ones who have to come up with the new ideas!
2. Algorithmic Design: Teaching AI to Think Like a Genius
GPT-5 is better at reasoning through problems than ever before. Altman even hints it might one day redesign itself—imagine software that evolves while you sleep.
This is what people mean when they talk about the road to Artificial General Intelligence (AGI). The smarter the design, the less supervision AI needs. The more autonomy it gains, the closer we get to machines that aren’t just tools—they’re collaborators.
3. Compute: The Gigawatt Challenge
GPT-5 doesn’t run on your laptop; it runs on an army of chips that gulp gigawatts of energy. One gigawatt can power about 750,000 homes.
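For a rough sense of scale, here’s a back-of-envelope check of that homes figure. The ~1.33 kW average household draw is my own assumption for illustration, not a number from the interview:

```python
# Back-of-envelope: how many homes can 1 gigawatt supply?
# Assumes an average continuous household draw of ~1.33 kW (rough ballpark);
# this figure is an assumption for illustration, not from the interview.
gigawatt_in_watts = 1_000_000_000      # 1 GW expressed in watts
avg_home_draw_watts = 1_330            # assumed average draw per home

homes_powered = gigawatt_in_watts / avg_home_draw_watts
print(f"~{homes_powered:,.0f} homes")  # ~751,880 homes, in line with the ~750,000 figure
```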
Why is this so hard? Every little interaction—even someone saying “Thank you”—costs OpenAI real money. Running GPT-5 takes efficient chips, advanced cooling, and massive infrastructure.
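To make “real money” a little more concrete, here’s a rough sketch of the electricity bill alone. Every input below is an assumed ballpark for illustration (energy per query, electricity price, query volume), not a figure stated in the interview:

```python
# Rough sketch of the daily electricity cost of serving queries at scale.
# All three inputs are assumed ballpark values, not figures from OpenAI.
energy_per_query_kwh = 0.0003      # ~0.3 Wh per query (assumed)
price_per_kwh_usd = 0.08           # assumed industrial electricity price
queries_per_day = 1_000_000_000    # assumed one billion queries per day

daily_cost = energy_per_query_kwh * price_per_kwh_usd * queries_per_day
print(f"~${daily_cost:,.0f} per day on electricity alone")  # ~$24,000/day
```

And that’s only the power bill; the chips, cooling, and infrastructure mentioned above add far more on top.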
That’s partly why Sam Altman invested in Oklo, a startup building compact nuclear reactors that could supply the clean, gigawatt-scale power AI needs to keep growing.
Beyond GPT-5: Where We’re Headed
Altman is clear: GPT-5 is impressive, but it’s not AGI yet. Future models might learn in real time, adapt continuously, and tackle problems we haven’t even imagined. Medicine, climate science, energy: they all stand to benefit.
The tricky part? It’s impossible to predict exactly where AI will be in the next 10 years. The pace of change is blistering, and breakthroughs can happen suddenly. Society needs to be ready for shifts in jobs, workflows, and even the way we think. In a world this fast-moving, adaptability isn’t just a skill—it’s survival.