AI Eats Content—but Where’s It Getting the Calories?
Let’s get something out of the way: AI is not coming to save you from your messy content problem.
I recently wrote a post for the Smarter Consulting blog (Will AI Replace Enterprise Content Management? Not So Fast), but I wanted to expand on the topic over here because I think it’s incredibly important, and I’m not convinced that people understand this point. Yes, AI can summarize, tag, generate, and search at scale. Yes, the latest LLMs are rewriting how we think about knowledge management. And yes, Google, Bing, and the entire search industry are being rebuilt around conversational, AI-first experiences.
But no—AI is not going to replace Enterprise Content Management (ECM). In fact, it makes ECM more critical than ever.
The Tempting Pitch: “AI Will Just Handle It”
We’ve all seen the demo. You type a natural language question—“What are the risks outlined in our Q4 supplier agreements?”—and poof, the AI model generates a beautiful, coherent answer, citing documents buried across four content repositories. I love those demos. I give them to clients regularly.
It’s magical. Or at least, it looks like it is.
That kind of experience is redefining expectations for how people engage with content. Why click through folders and fields when you can just ask?
From that lens, ECM starts to feel… outdated. Why bother with structured metadata, taxonomies, retention policies, or role-based access controls around your largely human-generated content when your chatbot can “just find things”?
It’s a seductive vision: faster answers, fewer humans in the loop, reduced overhead. And it’s not totally wrong—AI is making parts of traditional ECM faster and smarter.
But here’s where the fantasy breaks.
AI Is Not Self-Sustaining
The entire ecosystem of generative AI depends on one critical input: content. Not just any content—quality content. Human-generated, high-context, high-integrity content.
And without it? The whole thing starts to fall apart.
A 2023 research paper from Shumailov et al., titled “The Curse of Recursion”, dropped a brutal truth bomb: when large language models are trained on their own generated content (instead of fresh, real-world human data), their performance degrades—and fast.
“Use of model‑generated content in training causes irreversible defects… the tails of the original content distribution disappear.”
— Shumailov et al., 2023
In plain English: train AI on AI, and it forgets how to be interesting or useful. It loses nuance. It gets bland. It collapses into mediocrity. Here’s a great summary of the paper.
A Nature study from 2024 reinforced this:
“Indiscriminately learning from data produced by other models causes ‘model collapse.’ Models start losing information about the true distribution and fall into narrow, low-variance outputs.”
— Nature, July 2024
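The mechanism is easy to see in a toy simulation (my own sketch, not the authors’ code): fit a simple model to some data, generate new “content” from the fit, retrain on that content, and repeat. With only a finite sample at each step, estimation error compounds, and the tails of the distribution are the first thing to go.

```python
import numpy as np

# Toy illustration of model collapse (an illustrative sketch, not the paper's code):
# repeatedly fit a Gaussian to data, then replace the data with samples drawn
# from the fitted model. Finite-sample estimation error compounds generation
# after generation, and the distribution's spread (its "tails") drains away.

rng = np.random.default_rng(0)
n_samples = 25          # each generation "trains" on this many examples
n_generations = 2000    # how many times the model learns from its own output

data = rng.normal(0.0, 1.0, n_samples)   # generation 0: real, human-made data

for gen in range(1, n_generations + 1):
    mu, sigma = data.mean(), data.std()        # fit the model
    data = rng.normal(mu, sigma, n_samples)    # next generation sees only model output
    if gen % 400 == 0:
        print(f"generation {gen:4d}: fitted std = {sigma:.6f}")

# The fitted standard deviation trends toward zero: variance, nuance, and the
# rare-but-important cases in the tails are the first things to disappear.
```

Small training sets make the collapse fast; larger ones slow it down, but in this toy setup the drift runs the same direction.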
After spending hours (too many hours) reading discussion boards on the topic of recursion, I found one commenter who captured my own sentiment with a blunt but accurate take:
“AI consuming its own output … results in increasingly bland, artificial, and often mistaken output.”
So much for self-improving intelligence.
AI Consuming AI = Diminishing Returns
This is the existential crisis nobody wants to talk about: If AI systems are fed too much of their own regurgitated content, they start choking on it.
Call it a feedback loop. Call it entropy. Either way, if we stop producing new, human-authored content—if we stop organizing and maintaining the good stuff—we’re just speeding up the collapse.
Which brings us back to ECM.
Why ECM Isn’t Dead—It’s Indispensable
I’ve been in the information management and collaboration space since the late ’90s. That’s not a flex, and it’s not turf-defending nostalgia or protectionism. It’s a reality check.
I feel a bit like the kid in the crowd pointing out that the emperor’s not wearing any clothes. Everyone’s so enamored with the magic of AI, but someone’s got to say it: without real structure, real governance, and real human content, AI falls apart.
So let’s stop pretending ECM is just a legacy drag. Let’s redefine it. Not the SharePoint folder nightmare. Not the 50-page policy doc no one reads. Real ECM is the discipline of making your content:
- Organized – so it can be found.
- Classified – so it’s meaningful.
- Governed – so it’s safe.
- Trusted – so it’s usable.
That’s not bureaucracy. That’s the backbone of anything intelligent your AI is going to do. That’s the scaffolding modern AI systems need to work.
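To make those four properties concrete, here’s a rough sketch of what a single “AI-ready” document record might carry. The field names are hypothetical, not any particular ECM platform’s schema.

```python
# Hypothetical example of an "AI-ready" document record.
# Field names are illustrative, not tied to any specific ECM product.
supplier_agreement = {
    "id": "doc-2024-0187",
    "title": "Q4 Supplier Agreement - Acme Logistics",
    # Organized: where it lives and how it can be found
    "repository": "contracts",
    "path": "/contracts/suppliers/2024/Q4/",
    # Classified: what it is and what it's about
    "doc_type": "supplier_agreement",
    "tags": ["Q4", "logistics", "risk", "renewal"],
    # Governed: who may see it and how long it lives
    "access_roles": ["legal", "procurement"],
    "sensitivity": "confidential",
    "retention_until": "2031-12-31",
    # Trusted: provenance and currency
    "owner": "legal@company.example",
    "version": 3,
    "last_reviewed": "2025-01-15",
}
```

None of those fields exist until a human, or a well-governed process, puts them there, and every one of them is something an AI assistant will lean on.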
If your internal knowledge base is a disaster—outdated docs, redundant systems, conflicting file versions, no metadata—AI isn’t going to clean it up. It’s going to surface the mess, wrap it in a confident-sounding sentence, and deliver it to your exec team as “truth.”
AI Doesn’t Fix Content Chaos. It Amplifies It.
Garbage in, garbage out? Try: garbage in, synthetic garbage out, repeated at scale.
ECM is what makes your content AI-ready. It ensures your models are feeding on the good stuff: real customer feedback, contract nuances, meeting decisions, regulatory notes—all properly tagged, stored, and accessible.
Without that, your AI is just guessing.
But What About Search?
Let’s talk about what’s really changing. Search is being disrupted. No doubt. Google’s Search Generative Experience, Bing Chat, Perplexity.ai—they’re all showing us a world where answers are synthesized, not fetched. We’re entering a post-search era. It’s not “find a document,” it’s “give me the answer.” That changes everything on the front end.
But not on the back end.
Because those AI answers? They’re still built on top of structured knowledge. You can’t retrieve what you can’t find. And you can’t synthesize what you can’t trust. AI-powered search doesn’t make content organization obsolete—it makes it even more critical. If your content is a mess, your AI search experience will be too.
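Here’s a minimal sketch of why that matters on the back end (hypothetical documents and a naive keyword scorer, not a production retrieval pipeline): the retrieval step that feeds an AI answer can only filter on metadata that actually exists.

```python
# Naive retrieval sketch: the AI's answer is only as good as the documents
# the retriever can find and trust. The metadata does the heavy lifting.
# Documents and field names below are hypothetical.

documents = [
    {"title": "Q4 Supplier Agreement - Acme", "doc_type": "supplier_agreement",
     "quarter": "Q4", "status": "current",
     "text": "Risks: single-source dependency, 60-day termination clause."},
    {"title": "Q4 Supplier Agreement - Beta Corp (old draft)", "doc_type": "supplier_agreement",
     "quarter": "Q4", "status": "superseded",
     "text": "Risks: outdated pricing schedule from 2022."},
    {"title": "Holiday party planning", "doc_type": "memo",
     "quarter": "Q4", "status": "current",
     "text": "Risks: running out of snacks."},
]

def retrieve(query_terms, doc_type, quarter):
    """Filter on metadata first, then rank the survivors by keyword overlap."""
    candidates = [d for d in documents
                  if d["doc_type"] == doc_type
                  and d["quarter"] == quarter
                  and d["status"] == "current"]   # governance: skip stale versions
    return sorted(candidates,
                  key=lambda d: sum(term in d["text"].lower() for term in query_terms),
                  reverse=True)

for doc in retrieve(["risk", "termination"], doc_type="supplier_agreement", quarter="Q4"):
    print(doc["title"], "->", doc["text"])
```

Strip out the doc_type, quarter, and status fields, and the stale draft and the party memo start competing with the real contract for a spot in the answer. That filtering is ECM work, not AI work.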
Roles Are Changing—But They’re Not Going Away
Just as software moved from on-premises servers to services you consume in the cloud, AI will change many of the roles and responsibilities we perform today, but it will create as many new ones, or more. ECM professionals aren’t going extinct. They’re evolving. Fast.
- They’re curating the content AI needs.
- They’re designing metadata strategies that actually make AI more useful.
- They’re validating outputs, managing risk, and ensuring compliance.
- They’re the architects of usable, structured knowledge ecosystems.
AI will automate the grunt work—but the strategic layer? That’s growing in importance.
This isn’t the end of ECM. It’s its reinvention.
AI + ECM = Sustainable Intelligence
Years ago, after a user group presentation in Helsinki, I ended up at a pub (#SharePint, anyone?) arguing with an attendee about the state of online content. His stance? There’s too much of it. Low-quality. Redundant. Noise. And it was making search less effective.
My take? We don’t need less content—we need more of the right kind. Not just volume, but structure. Filters. Signal. Context.
I feel the same way about AI.
It’s not that AI doesn’t want more content. It absolutely does. But it also needs better content.
- Clean.
- Contextual.
- Human-made.
- Governed.
And guess what? That’s exactly what modern ECM is supposed to deliver.
So here’s the uncomfortable truth: if you want better AI, you need better content management.
If you don’t clean it, tag it, store it, secure it, and structure it—don’t expect AI to figure it out. Expect hallucinations. Expect compliance failures. Expect bad decisions wrapped in good prose.
AI Is Your Smartest Intern and Your Dumbest Expert
AI is confident, fast, articulate—and completely unaware of its own limitations.
It doesn’t know what’s missing. It doesn’t know when it’s wrong. It doesn’t know what’s sensitive or confidential or off-limits.
You do.
That’s why the work of ECM—the tagging, classifying, cleaning, labeling, retaining—is not busywork. It’s strategy. It’s what keeps your AI honest.
And more than that: it’s what keeps your business smart.
Final Thought: Don’t Let the Hype Kill the Craft
It’s easy to buy into the AI hype cycle. It’s exciting. It’s new. It’s powerful.
But don’t lose sight of this:
AI is only as good as the content you feed it. And that content still needs humans to create it, organize it, and make it matter.
You can throw LLMs on top of chaos and pray. Or you can do the hard work of making your content ecosystem intelligent and sustainable—so your AI can be too.
So no, AI isn’t replacing ECM. It’s making it more important than ever.