Curators of Truth: What AI Can’t Do Without Us
- kmichael469
- May 22
- 4 min read
By Jason Kaufman
Let’s start with the obvious. AI can write. Faster than us, more confidently than us, and, on a good day, slicker than we’d like to admit. But ask it to know, really know, and it flounders like a freshman on open-mic night trying to perform Beckett without knowing who Beckett was. It can generate content, yes. But it cannot generate meaning. Not without help.
That's where we come in.
And by we, I mean the tired, sharp, skeptical, quietly heroic knowledge workers. The documentation teams, the content strategists, the tech writers, and the information architects. The ones who’ve spent their lives turning chaos into clarity, and clarity into something a human being can actually use.
In the age of AI, we are no longer just content creators. We are curators of truth. And that’s not a metaphor. That’s the job now.
The AI Hype Hangover
The AI honeymoon is over. We've all seen the demos. We’ve heard the promises. Generate 100 articles in an afternoon. Rewrite the entire knowledge base in one click. Summarize your Slack history into a 12-step program for better collaboration. It sounds like magic. Until it doesn't.
Because speed is not the problem. Scale is not the problem. Trust is the problem.
What happens when your AI tells a customer that a deprecated feature is still supported? When it confidently asserts an outdated procedure? When it makes something up just to fill in a blank?
It doesn’t matter if it was a “pretty good guess.” Your user doesn’t want guesses. They want truth. And they want it now.
What the Machines Are Missing
AI is not creative. Not really. It doesn’t dream. It doesn’t doubt. It doesn’t revise something because it suddenly remembers how its mother used to say it better. It has no conscience, no instinct, no whisper in the back of its mind saying, “Wait, is that actually true?”
And so, it hallucinates. Not because it's broken, but because it has no way to verify. It’s performing improv in a room with no windows.
This is where we come in. We, the curators. The editors. The last line of defense before something untrue becomes permanent.
We teach the machines what to trust. And more importantly, what not to trust.
Prompts Are the New Pages
A lot of folks think of prompts like they think of sticky notes. Temporary. Disposable. Slightly annoying. But prompts are actually interfaces. They are how we program behavior into AI. And just like a good API, a user manual, or a surgical checklist, a well-crafted prompt saves lives. Or at the very least, a lot of time and embarrassment.
But here’s the kicker. Prompts only work when they’re built on a foundation of shared truth.
That’s where the shift happens. Content professionals used to write for people. Now we write for people and machines. Our prompts are part instruction, part persuasion, and part performance review. They need to be clear, contextual, and above all, grounded in trusted knowledge.
TRACI (Task, Role, Action, Context, Intent) helps, but even the best prompt fails if the AI is working from garbage data. You don’t give a surgeon a butter knife. You don’t give a writer a dead link. And you definitely do not hand an LLM a pile of Slack messages and call it “source material.”
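To make that concrete, here is a minimal sketch of a TRACI-style prompt assembled in Python. The helper function, the field values, and the knowledge-base details are invented for illustration; the point is that every field gets filled from curated, verified content instead of being left for the model to guess.

```python
# A minimal sketch of a TRACI-structured prompt (Task, Role, Action, Context,
# Intent). The helper and every field value here are hypothetical examples.

def build_prompt(task, role, action, context, intent):
    """Assemble a TRACI prompt as plain text for whatever LLM API you use."""
    return (
        f"Task: {task}\n"
        f"Role: {role}\n"
        f"Action: {action}\n"
        f"Context: {context}\n"
        f"Intent: {intent}"
    )

prompt = build_prompt(
    task="Draft a customer-facing answer about exporting reports.",
    role="You are a support writer for our analytics product.",
    action="Summarize the current export procedure in five steps or fewer.",
    context=(
        "Use only the attached KB article, last verified 2025-04-02. "
        "CSV export was deprecated in v4.1; XLSX is the supported path."
    ),
    intent="The reader should leave knowing exactly which format to use.",
)
print(prompt)
```

Swap any of those values and the prompt still works. Swap in unverified context and the prompt starts working against you.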
Knowledge Work Is Now Editorial Work
Let’s get something straight. This isn’t about writing more. It’s about deciding what deserves to be written down at all. What’s still true? What’s actually useful? What do we want the AI to remember?
In this sense, truth curation is editorial work of the highest order. It’s part librarian, part philosopher, part janitor. You're not just capturing knowledge. You're making it useful at scale.
That means:
Verifying facts before they become folklore.
Tagging content with metadata that adds context (sketched below).
Reviewing prompts as seriously as you’d review an API.
And it means resisting the urge to let AI become a creativity vending machine. Just because it can write doesn’t mean it should.
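On the metadata point specifically, here is a minimal sketch of what “context” can mean in practice. The field names and the freshness check are hypothetical, not pulled from any particular CMS or schema.

```python
from datetime import date

# A minimal sketch of metadata that adds context. Field names are hypothetical,
# not taken from any particular CMS schema.
article_metadata = {
    "title": "Exporting reports",
    "owner": "docs-team",                 # who is accountable for keeping it true
    "last_verified": "2025-04-02",        # when a human last checked the facts
    "applies_to_versions": ["4.1", "4.2"],
    "status": "current",                  # current / deprecated / draft
    "sources": ["release-notes/4.1.md"],  # where the claims can be re-checked
}

def is_fresh(meta: dict, max_age_days: int = 180) -> bool:
    """Let only current, recently verified content anywhere near a prompt."""
    verified = date.fromisoformat(meta["last_verified"])
    return meta["status"] == "current" and (date.today() - verified).days <= max_age_days
```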
Institutional Memory Loss
Let’s talk about the real problem nobody wants to admit. People leave. They take everything they know with them. And we’re left with folders of half-written SOPs and SharePoint sites nobody opens.
Generative AI promises to solve that. But only if it’s trained on something real. Something curated. Something human.
Without people actively capturing and curating institutional knowledge, AI becomes a parrot with amnesia. It sounds smart, but it doesn’t remember anything worth knowing.
That’s why curators of truth are indispensable. We don’t just save what’s said. We save what’s meant. We connect the dots between what happened, why it happened, and what should happen next.
Trust Is the New UX
We’ve all heard the UX mantras. Delight your users, reduce friction, make things intuitive. But in an AI-driven workplace, the real UX is about trust.
Can I trust what this assistant just told me?
Can I trust that this recommendation is based on current data?
Can I trust that the person who wrote this prompt actually understood the context?
Trust isn’t an add-on. It’s the interface. It’s the thing that decides whether AI becomes a trusted teammate or a liability with a keyboard.
And trust comes from us. From the curators.
The New Knowledge Stack
The future of knowledge work doesn’t look like a wiki. It looks like this:
Capture in real time.
Curate with context.
Activate through AI.
It’s a living system. Not a repo.
In this world, prompts are reusable assets. Insights are versioned like code. And content teams are not support staff. They are intelligence architects.
This is what it means to build a knowledge base that teaches the AI to think like your best people.
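Here is a minimal sketch of that capture, curate, activate loop, with hypothetical function names and fields; a real system would swap in its own storage, review workflow, and model call.

```python
# A minimal sketch of the capture -> curate -> activate loop. Function names
# and fields are hypothetical; a real system would plug in its own storage,
# review workflow, and LLM call.

def capture(raw_note: str, author: str) -> dict:
    """Grab knowledge the moment it surfaces, before it walks out the door."""
    return {"text": raw_note, "author": author, "verified": False, "tags": []}

def curate(record: dict, reviewer: str, tags: list[str]) -> dict:
    """A human verifies the fact and adds the context that makes it reusable."""
    return {**record, "verified": True, "reviewed_by": reviewer, "tags": tags}

def activate(records: list[dict], question: str) -> str:
    """Build prompt context from curated knowledge only; ignore the rest."""
    facts = "\n".join(f"- {r['text']}" for r in records if r["verified"])
    return f"Answer using only these verified facts:\n{facts}\n\nQuestion: {question}"

note = capture("CSV export was deprecated in v4.1; use XLSX.", author="jkaufman")
note = curate(note, reviewer="docs-team", tags=["exports", "deprecation"])
print(activate([note], "Which export format should I recommend to a customer?"))
```

None of this is exotic engineering. The hard part is the middle step, and the middle step is human.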
Final Thought: Teach It What You Know
I used to think writing was about clarity. Then I thought it was about meaning. Now I think it’s about memory.
Because what we write—what we save, verify, and share—becomes the mind of the machine. It becomes the collective memory of the teams we love, the products we build, the customers we serve.
So let’s teach it what we know.
Let’s give it something worth remembering.