When “Good Enough” Is Good Enough: The Rise of Intelligence Saturation in AI
I was recently at an AI meetup and got to talking with someone about how we use popular large language models like ChatGPT and Perplexity in our day-to-day work. We both noticed something curious: for many of our non-technical tasks, the value we get from each new, more advanced model seems to be plateauing. The latest upgrades promise smarter, faster, more ‘human’ intelligence. But for the non-technical work we actually do, we’re not seeing the leaps we used to. The improvements are getting smaller, and sometimes they’re not noticeable at all. In fact, when I run out of tokens and switch to an older model, the difference is almost imperceptible.
This isn’t just a passing observation. There’s a term for it: intelligence saturation. Let’s unpack what that means and why it matters.
Intelligence saturation is the idea that, for certain tasks, AI has gotten so good that making it even smarter doesn’t move the needle. Think of it like pouring water into a glass that’s already full; no matter how much more you add, it just spills over the edge. For many specific tasks we ask of these models, the glass is already full.
But there’s a big difference between tasks and jobs. A task might be ‘summarize this email thread’ or ‘draft a meeting agenda.’ A job is the whole package: managing a project, guiding a team, keeping long-term goals in sight. Doing a job well requires not just intelligence, but intent: the ability to hold onto a purpose and direction over weeks, months, or even years.
Today’s AI is saturating at the task level. It’s not even close to saturating at the job level. The ability to maintain intent over time, to adapt as circumstances shift, to remember why you started and where you’re going, is still a uniquely human skill. Even the most optimistic estimates (not that estimates have been particularly reliable in this field) suggest that AI agents might be able to maintain their “intent” for a week or two, not the years that most of us spend in our roles. And while few professional projects truly run for years without a reset of intent, a couple of weeks still falls well short. These numbers are changing rapidly, so please don’t hold me to my predictions. ;)
What does this mean for the evolution of AI and AI tools? First, we can’t ignore that with the high cost of compute and the energy it consumes, model and tool makers will be eager to right-size their tools and models. Like driving a Ferrari to the grocery store for a gallon of milk, there’s little reason to use an overpowered model or tool to complete a task that a lower-powered one can do just as well.
Another thing that comes to mind is that at some point, and I believe it’s already happening, intelligence is going to become commoditized. You can test this yourself: run the same prompt through a couple of different models and see if the outputs are materially different. In my experience, they are not. At least, not different enough for most tasks I need done. If every tool can summarize, draft, and organize with the same proficiency, the differentiator isn’t the intelligence; it’s the user experience. A big part of the user experience is how well the steps fit into one’s workflow. Tools that fit seamlessly into our daily routines have a real advantage. Who hasn’t been frustrated by online forms that won’t let you paste in copied info, or by having to copy and paste an address into Google Maps instead of just clicking on it? (I’m looking at you, iPhone!)
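If you want to make that spot-check a little more systematic, here’s a minimal sketch using Python’s standard difflib to score how similar two models’ answers are. The two sample outputs below are made up for illustration; in practice you’d paste in real responses from whichever models you’re comparing, and a crude character-level ratio like this is only a rough proxy for “materially different.”

```python
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Rough 0-to-1 similarity between two model outputs (case-insensitive)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


# Hypothetical outputs from two different models given the same prompt:
# "Summarize this email thread in one sentence."
output_model_a = "The team agreed to move the launch to March and assigned QA to Dana."
output_model_b = "The team agreed to move the launch to March, with Dana owning QA."

score = similarity(output_model_a, output_model_b)
print(f"Similarity: {score:.2f}")
if score > 0.7:
    print("For this task, the outputs are not materially different.")
```

If you run the same comparison across a handful of everyday prompts and the scores stay high, that’s commoditization showing up in your own workflow.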
Right now, there’s still a lot of overhead in using AI tools. Most of us are juggling multiple tabs, copying and pasting, and managing a patchwork of outputs. That’s not sustainable, and it’s a clear opportunity for innovation. The companies and app builders who focus on workflow, not just intelligence, will pull ahead.
As someone who works with and in AI, I see intelligence saturation as both a challenge and an opportunity. The challenge is that the old playbook of just making the model smarter won’t cut it anymore. The opportunity is that the real value now lies in design, integration, and user experience. If you’re building or implementing AI, ask yourself: Are you chasing marginal gains in intelligence, or are you making it radically easier for people to get their work done? I believe that the future belongs to those who focus on the latter.
And if you’re feeling like the latest AI model isn’t changing your world, you’re not alone. It just means you’re already at the frontier of what these tools can do for your tasks. The next leap won’t be in intelligence; it’ll be in how that intelligence is delivered.
If you’ve noticed the same thing in your own work, I’d love to hear about it. Where do you see intelligence saturating, and where do you still feel the gap? Let’s keep the conversation going.
If you're ready to bring the power of AI into your product management, project management, or delivery processes—or if tackling AI governance feels like a mountain you need to climb—I’m here to help. Reach out to me today or follow me here for actionable insights and strategies to harness AI effectively and efficiently.