Someone told me to watch “Star Trek Academy” to wind down. I rarely watch TV and I needed a break. What was supposed to be a quiet evening turned into a full-on whiteboard session in my head.
Because as I watched, my nerdy mind kept screaming one thing: SAM is every “Talk to Your Data” agent being promised to the enterprise. And the “trauma” is the decades of ugly, uncleaned, undocumented, and uncataloged data warehouses we are forcing them to process without a single day of experience!
Yeah... even when I try to force myself to wind down, my brain only thinks about work.
So who is SAM?
A 32nd-century holographic cadet, the first of her kind. SAM was built from two centuries of processing, then activated with the knowledge of an 18-year-old. She has the vocabulary, the facts, and the cognitive ability of any cadet. Her creators wanted her to be unbiased, so they gave her no past, no memories, no time to grow up. And after 209 days of absorbing trauma and heavy human emotions while "living on Earth," she glitched. Not because she isn't intelligent enough, but because she never lived through the 0 to 18. She lacked resilience: the kind you learn through childhood, walking your first steps and falling.
SAM was given the destination without the journey. And without the resilience that only comes from years of small failures and slow corrections, the first real weight she carries destroys her.
And of course, I kept wondering if that’s what we’re doing with AI in data and analytics.
SAM: The Semantic Layer
Semantic layers were one of the most important things to happen to enterprise data. Getting an entire organization to agree that “revenue” is calculated this way and “churn” means that is hard but necessary work. And it solved a real problem: consistency. But semantic layers were designed to serve human analysts who already brought their own experience to the table. Now we’re putting AI agents on top of that same layer and saying: go be an analyst.
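To make the contrast concrete, here's a rough, purely illustrative sketch of what a semantic-layer entry hands an agent. The field names and SQL are hypothetical, not any particular product's schema:

```python
# Purely illustrative: roughly what a semantic-layer metric definition
# gives an AI agent. Field names and SQL here are made up for the example.
revenue = {
    "name": "revenue",
    "description": "Recognized revenue, net of refunds",
    "sql": "SUM(order_items.amount) - SUM(refunds.amount)",
    "grain": "daily",
    "owner": "finance",
}

# Everything above is vocabulary: agreed-upon definitions, consistently
# applied. Nothing in it says what to do when the refunds join silently
# drops rows, or when the asker means something else by "revenue."
print(revenue["sql"])
```

That consistency is genuinely valuable. It's just not experience.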
That’s SAM at 18. It has the vocabulary. It has the definitions. But it has never been wrong. It’s never traced a broken join through three undocumented tables because the numbers didn’t add up. It’s never had a client say “market basket” or “grace period,” only to learn after three rounds of miscommunication that they meant something completely different. It’s never looked at a technically correct result and felt in its gut that something was off.
Your best analyst didn’t become your best analyst by reading a data dictionary. They became your best analyst by surviving messy data for years. By building resilience one mistake, one rough client feedback, one correction, one dead end at a time.
209 days of trauma: the enterprise data problem
I had to build a data product from EHR (electronic health record) data once. If you’ve worked with EHRs, you know. And this mess isn’t unique to healthcare. It’s everywhere.
Most enterprise data warehouses are full of the same mess. Definitions that mean one thing in finance and something else in ops. Tables that should have been deprecated three years ago but somehow still power a dashboard someone’s VP checks every Monday. Joins that work most of the time and quietly produce garbage the rest. Business rules that live in one person’s head because they never got around to writing them down.
And we’re pointing an LLM at all of that. Raw. All at once. No framework. No memory of being burned before. No resilience.
Of course it breaks. Of course it hallucinates. It gives you a confident answer that’s wrong and it doesn’t even know it’s wrong — because it has no mechanism for that gut feeling of “wait, something’s off here.” That instinct doesn’t come from definitions. It comes from years of getting burned.
That’s SAM processing 209 days of trauma without ever learning how to cope with a bad day.
The 0 to 18: Can’t that be a layer?
We’re doing the same thing to our AI agents. Every fact in the data warehouse. Not a single memory of what to do with it.
So how do we turn decades of analyst experience into something AI can actually use? The patterns. The sequences. The “we tried that, it doesn’t work.” The “when you see this, always check that first.”
Some call it a context layer. Honestly, I’m not even sure “layer” is the right word. A layer just sits there. And resilience doesn’t just sit there. It’s active. It’s the thing that kicks in when something doesn’t add up — the thing that says slow down, check this, you’ve seen this before.
Maybe it’s more of an engine. Something that processes your organization’s analytical history — the mistakes, the corrections, the dead ends, the breakthroughs — and encodes it in a way that AI can actually learn from. Not just definitions. Biography.
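A minimal sketch of the shape I'm imagining, in Python. Everything here is hypothetical (the class names, the toy keyword matching), but it shows the idea: past corrections are stored as lessons and actively consulted before answering, instead of being a static glossary the agent reads once:

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    # One unit of "0 to 18": a situation an analyst got burned by,
    # and the check they learned to run first next time.
    trigger: str   # what the question or data looked like
    check: str     # what to verify before trusting the result

@dataclass
class ExperienceEngine:
    # Hypothetical sketch, not a real product: a store of past
    # corrections consulted *before* the agent answers.
    lessons: list = field(default_factory=list)

    def record(self, trigger: str, check: str) -> None:
        self.lessons.append(Lesson(trigger, check))

    def recall(self, question: str):
        # Naive keyword overlap stands in for real retrieval
        # (embeddings, metadata filters) in a production system.
        words = set(question.lower().split())
        return [l for l in self.lessons
                if words & set(l.trigger.lower().split())]

engine = ExperienceEngine()
engine.record("revenue by region joins orders to accounts",
              "that join fans out on multi-account orders; dedupe first")
engine.record("churn for trial users",
              "ops and finance define churn differently; confirm which one")

for lesson in engine.recall("Why did revenue by region double last month?"):
    print("Before answering, check:", lesson.check)
```

In a real system, the keyword overlap would be proper retrieval, and the lessons would come from review threads, post-mortems, and corrected queries rather than hand-typed strings. The point is only the shape: biography, stored and replayed, not definitions sitting still.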
I’m still working through it.
But I keep thinking about SAM. They gave her everything except the one thing that mattered — the time to grow up. And I think that’s exactly what we’re skipping when we hand an AI agent a semantic layer and call it ready.
SAM needed the 0 to 18. So does your AI.
So much for winding down. I don’t know if I like Star Trek Academy BUT I’m a big fan of SAM!
