Ops Command Center v3.2.1
KB-WA-2026 Ready
Created May 16, 2026

What a Real AI Roadmap Session Looks Like (And What You'll Walk Away With)

What actually happens in an AI roadmap session. The real agenda, the questions that always come up, what the team walks away with, and what happens in week 2.

Tags:
#AI #manufacturing #roadmap #strategy #implementation #leadership

People ask me what the roadmap session actually looks like. I think they expect a slide deck and a facilitated workshop where we put sticky notes on a wall and vote on priorities.

That’s not it.

It’s closer to: I spend a day embedded with your leadership team and a handful of your operators, and by the end of that day you have a clear picture of what AI is actually useful for in your specific operation, what the architecture needs to look like, which project should go first, and who should own it. Oh, and we usually build something live.

Let me walk you through what a real session looks like.

Why I Abandoned the Standard Deck

I used to run more structured sessions. Prepared materials, specific agenda, pre-defined outputs. That structure broke down within 20 minutes every time, because every operation is different and the conversations that matter can’t be scheduled.

The estimator who casually mentions that she processes 17 different supplier PDF formats before she can start the actual estimate — that’s worth an hour. The scheduler who’s building a Python tool in his spare time to handle cell-balance optimization — that’s worth stopping and looking at. The floor supervisor who immediately proposes a specific workflow for the daily huddle AI assistant when you describe the concept — follow that.

The best sessions I run are conversations that go somewhere unexpected and produce real artifacts by the end. The sessions that produce the best decks produce the worst outcomes.

A roadmap session that produces a polished deliverable but leaves no one fired up to build something is a waste of a day. The measure of success is momentum, not documentation.

The Actual Session Flow

Here’s what typically happens, in rough order:

| Phase | Duration | What Happens |
| --- | --- | --- |
| Context and grounding | 60-90 min | I learn how the operation actually works. Floor tour, if possible. I ask about the metrics that matter, the competitive pressure, the problems that keep the CEO up at night. |
| AI fundamentals | 45-60 min | Not a lecture — a conversation. I cover the concepts that are most relevant to their operation. Skills, agents, commands. The L1-L4 framework. The three-tier security model. Concrete examples for each. |
| Live demonstration | 60-90 min | I build something in the room. Typically a skill based on a document they hand me — an NDA, a quality policy, a work instruction. They watch it go from zero to working. This is usually the moment the room shifts. |
| Pain-point mapping | 90-120 min | Each department head walks through their highest-friction workflows. We map each pain point to an AI workflow type, an appropriate capability level (L1-L4), and an initial complexity estimate. |
| Architecture discussion | 45-60 min | Data classification, security tiers, tooling decisions, infrastructure questions. Usually the territory of whoever is the IT or operations lead in the room. |
| Ambassador identification | 30 min | Each department head identifies their ambassador candidate. We discuss the 30-day model and what a good first assignment looks like. |
| Roadmap prioritization | 30-45 min | We take the pain-point list and make preliminary sequencing decisions: what goes first (highest impact, lowest complexity, best ambassador match), what goes second, what's a six-month problem. |
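The sequencing logic in that last row can be sketched as a simple scoring pass. This is a hypothetical illustration, not a fixed method — the field names, weights, and example pain points are all placeholders:

```python
# Hypothetical prioritization sketch: rank pain points for the first wave.
# Weights and field names are illustrative assumptions, not a fixed formula.

def score(pain_point):
    # Higher impact is better; lower complexity is better;
    # a matched ambassador is a strong tiebreaker.
    return (
        pain_point["impact"] * 2          # 1-5 estimate from the dept head
        - pain_point["complexity"]        # 1-5 initial complexity estimate
        + (3 if pain_point["ambassador"] else 0)
    )

pain_points = [
    {"name": "Supplier PDF intake",    "impact": 5, "complexity": 2, "ambassador": True},
    {"name": "Cell-balance schedule",  "impact": 4, "complexity": 4, "ambassador": True},
    {"name": "Daily huddle assistant", "impact": 3, "complexity": 1, "ambassador": False},
]

first_wave = sorted(pain_points, key=score, reverse=True)
for p in first_wave:
    print(p["name"], score(p))
```

In the room this happens as a conversation, not a spreadsheet — but making the criteria explicit like this keeps the sequencing argument honest.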

The total is usually 6-8 hours, sometimes with a lunch break in the middle. It’s a long day. It needs to be — the important conversations happen in the second half when the trust is built and the team is willing to be honest about what’s actually broken.

The Questions That Always Come Up

After enough of these sessions, I can predict the questions before they’re asked.

“Is everyone behind on this? Are we behind?” No. AI adoption in real operations is below 10%. The companies you’re reading about in the trade press are not representative of where mid-market manufacturing actually is. You have time, but not unlimited time.

“Where does the data go?” Every time. This is the right question. The answer is the three-tier architecture — local for sensitive, Microsoft enclave for business-sensitive, scrubbed external for low-sensitivity. I spend real time on this because “I’ll explain it later” is not an acceptable answer to a question about customer IP.
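The routing decision behind that answer can be expressed as a small dispatch rule. A minimal sketch — the tier names follow the text, but the classification labels are illustrative assumptions:

```python
# Hypothetical sketch of three-tier data routing.
# Classification labels are illustrative; the tiers follow the model above.

TIER_BY_CLASSIFICATION = {
    "customer_ip":        "local",              # sensitive: never leaves the building
    "business_sensitive": "microsoft_enclave",  # tenant-bound cloud processing
    "low_sensitivity":    "external_scrubbed",  # scrubbed before external models
}

def route(document_classification: str) -> str:
    """Return the processing tier for a document, failing closed to local."""
    return TIER_BY_CLASSIFICATION.get(document_classification, "local")

print(route("customer_ip"))    # local
print(route("unlabeled"))      # unclassified data fails closed to local
```

The design choice worth noting is the default: anything unclassified stays local until someone explicitly decides otherwise.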

“What does this actually cost?” Also every time, usually from the financial lead. Token economics, model selection, per-seat budgets — I walk through the real math. Because “it’s cheap” is not helpful and “it’s expensive” is a blocker. The actual answer depends on your volume and your use case mix.
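The "real math" varies by provider and model, but its shape is simple: tokens in, tokens out, requests per day. A hypothetical sketch — the per-million-token prices below are placeholders, not quoted rates:

```python
# Hypothetical token-economics sketch. Prices are placeholder assumptions,
# not quoted rates; substitute your provider's current price sheet.

def monthly_cost(requests_per_day, in_tokens, out_tokens,
                 price_in_per_m=3.00, price_out_per_m=15.00, days=22):
    """Estimated monthly spend in dollars for one workflow."""
    daily = requests_per_day * (
        in_tokens / 1_000_000 * price_in_per_m
        + out_tokens / 1_000_000 * price_out_per_m
    )
    return daily * days

# e.g. a document-review workflow: 40 reviews/day, ~6k tokens in, ~1k out
print(round(monthly_cost(40, 6_000, 1_000), 2))  # roughly $29/month at these rates
```

Running the numbers like this, per workflow, is what moves the conversation past "cheap" versus "expensive" into an actual budget line.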

“What happens to people?” This one carries more weight than the others. The honest answer: they move up. The tasks that don’t require their expertise get handled by AI. They spend their time on the work that requires human judgment, creativity, and relationships. The manufacturers who handle this well are explicit about the narrative — “we’re giving you tools, not taking your job” — and then they prove it by giving people back time and not reducing headcount.

“How long until it works?” The first win can happen in a week. A production-quality first workflow can be running in 30 days. A full first-wave deployment (3-5 workflows across multiple departments) typically takes 90 days. Planning to have “the AI system” done in 90 days is the wrong frame — planning to have three specific workflows running in 90 days is the right frame.

The Live Build Moment

This is the part I don’t want to skip past.

In a recent session with a medical device manufacturer, I built an NDA review skill from scratch during the session. Took the company’s standard NDA, gave the AI a set of review instructions, and within 15 minutes we had a working skill that could analyze an NDA, flag non-standard clauses, and produce a structured summary for legal review.

Then I chained a second skill: a rewrite skill that could take the flagged clauses and draft alternative language. Both were operational in under 30 minutes total.
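The shape of that chain is two small, composable steps: one that analyzes and flags, one that consumes the flags and drafts replacements. A minimal sketch with stub logic standing in for the actual model calls — function names, clause labels, and the standard-clause set are all illustrative:

```python
# Hypothetical sketch of chaining a review skill into a rewrite skill.
# Stub logic stands in for model calls; all names are illustrative.

STANDARD_CLAUSES = {"term", "confidentiality", "governing_law"}

def review_nda(clauses: dict) -> list:
    """Flag clauses that deviate from the standard set."""
    return [name for name in clauses if name not in STANDARD_CLAUSES]

def rewrite_flagged(clauses: dict, flagged: list) -> dict:
    """Draft alternative language for each flagged clause."""
    return {name: f"[proposed standard language replacing: {clauses[name]}]"
            for name in flagged}

nda = {
    "term": "3 years",
    "confidentiality": "mutual",
    "non_solicitation": "5 years, all employees",  # non-standard clause
}

flagged = review_nda(nda)
drafts = rewrite_flagged(nda, flagged)
print(flagged, drafts)
```

The point of the chain is that the second skill never sees the whole document — only the flagged output of the first — which keeps each step small, auditable, and easy to swap out.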

The energy in the room after that was different. There’s an abstract version of “AI can do this” and there’s a concrete version where the thing is running in front of you, on your document, producing output you recognize. The second version changes the conversation from “should we do this?” to “how fast can we do this?”

That moment is why the live build is non-negotiable in every session.

👉 Tip: If you’re running an AI session with your team and there’s no live demonstration of something being built, you’re doing an AI presentation, not an AI session. The difference matters.

What You Walk Away With

At the end of the session, the team has:

  1. A pain-point map — every friction point surfaced by department, mapped to an AI workflow type and a capability level.

  2. A prioritized roadmap — three to five workflows selected for the first wave, based on impact, complexity, and ambassador availability.

  3. Named ambassadors — one per department, with a clear 30-day assignment.

  4. An architecture decision framework — data classification, security tier mapping, tooling decisions.

  5. A live example — something built during the session that demonstrates the actual capability, not the theoretical capability.

  6. A follow-up process — each department head sends their detailed pain-point list; I return a mapped workflow recommendation for each item within two weeks.

What the team does not walk away with: a 50-slide strategy deck, an implementation timeline that extends 18 months into the future, or a vendor recommendation that makes someone else responsible for making this work.

What Happens in Week 2

The real work starts after the session. The ambassadors are briefed by their department heads. They start working through their 30-day assignments. The pain-point lists come in. I review and map them.

Then there’s a working session — smaller group, more technical — to start actually building the first workflow. This is where the rubber meets the road and where the difference between companies that ship things and companies that plan things becomes visible.

The companies that move fast in week 2 are the ones that had a genuine ambassador in the room during the roadmap session. The companies that slow down in week 2 are the ones where the roadmap session was attended by the people who approved it but not the people who will build it.

👉 Tip: Don’t run the roadmap session for the executive team only. The executives need to be in the room. But so do the operators, the schedulers, the estimators — the people who will actually use these tools. Their buy-in isn’t just nice to have; it’s what makes week 2 work.

🔧 Tool: After the session, create a shared tracking document: workflow name, ambassador owner, target go-live date, current status. Review it in every subsequent check-in. This is not a project management platform — it’s a 1-page table that makes progress visible.
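That 1-page table can be generated as plain text in a few lines, if you want it scripted rather than hand-maintained. A hypothetical sketch — the workflow names, owners, and dates are illustrative:

```python
# Hypothetical sketch: render the 1-page tracking table as plain text.
# Workflow names, owners, and dates are illustrative placeholders.

rows = [
    ("Supplier PDF intake",    "J. Alvarez", "2026-06-15", "building"),
    ("NDA review skill",       "M. Chen",    "2026-06-01", "live"),
    ("Daily huddle assistant", "R. Okafor",  "2026-07-01", "scoping"),
]

header = ("Workflow", "Ambassador", "Target go-live", "Status")
widths = [max(len(str(r[i])) for r in rows + [header]) for i in range(4)]

def fmt(row):
    """Pad each cell to its column width for aligned plain-text output."""
    return "  ".join(str(cell).ljust(w) for cell, w in zip(row, widths))

print(fmt(header))
print(fmt(tuple("-" * w for w in widths)))
for row in rows:
    print(fmt(row))
```

A shared spreadsheet works just as well — the tool matters far less than the habit of reviewing it in every check-in.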

That’s what a real roadmap session looks like. No sticky notes. No theoretical frameworks. Just a clear picture of your operation, your opportunities, and who’s going to make them real.


The next step is the session itself. Let’s schedule it.
