GPT-5, Gigawatts, and the Future of AI
The prefix “giga” comes from Greek and means billion (that's nine zeros). This week, I caught an insightful chat between Sam Altman, CEO of OpenAI, and Cleo Abram. The topic? GPT-5, the smarter, sleeker sibling of GPT-4o. Redditors are already complaining that GPT-5 feels “dry” (some were apparently attached to GPT-4o, lol), but in the AI world, change is the only constant, and these models are evolving fast!

The Three Pillars Holding AI Back

Altman says AI isn't just about clever code; it's about Data, Algorithmic Design, and Compute. Here's the breakdown:

1. Data: Knowing Everything (But Not Really)

GPT-5 has practically ingested all the content you can find online. It's a walking encyclopedia with a dash of personality. But don't be fooled: AI still struggles with the “1,000-hour problems” while breezing through the one-minute stuff. Creativity? That's still our turf! AI can summarize, predict, and analyze, but it hasn't yet learned how to imagine something that never existed.