Your Business Has a Stack. Your AI Should Too.

Here’s a thought that’s been running through my head as I read more and more about the future of AI in business: technology is cyclical, swinging back and forth between best-of-breed and centralized solutions, and it feels like we’re shifting back toward best-of-breed once more.

Every modern business runs on a tech stack. You’ve got front-end frameworks talking to back-end services, APIs linking to databases, and data pipelines feeding dashboards. It’s all modular, all designed to scale. This is how software gets built: in layers, with each component doing its job, cleanly separated and composable.

So why are we treating AI any differently?

Right now, a lot of businesses are using AI like a monolith, a single model slapped onto a problem, often through a basic chatbot interface. It might look cool for a minute, but it won’t last. It won’t scale. And it definitely won’t adapt. It’s the same pattern we’ve seen before — centralized, one-size-fits-all platforms promise simplicity but hit walls fast.

To do AI right, in other words, to make it useful, reliable, and deeply integrated into your business, you need an AI stack. One that mirrors how software has evolved: layered, modular, composable. In short: best-of-breed.

Software Has Layers. So Should AI.

In software development, stacks are powerful because they separate concerns. The database doesn’t care what the user interface looks like. The front-end doesn’t need to know how data gets processed. Layers abstract complexity and allow for independent development, testing, and scaling.

The same thinking applies to AI, especially as the ecosystem matures. You shouldn’t rely on one model to do everything. Instead, think in terms of layers:

  • Retrieval Layer: Where relevant knowledge is pulled in real time.
  • Model Layer: Where foundational or fine-tuned models do the reasoning.
  • Agent Layer: Where decision-making and task orchestration happen.
  • UI Layer: Where users interact with AI outputs — chatbots, dashboards, apps.

Each of these layers can evolve independently. You can swap models, upgrade tools, or reconfigure logic without blowing everything up.
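
To make the layering concrete, here’s a minimal sketch in Python. The interfaces and names are illustrative assumptions, not any particular framework’s API; the point is that each layer sits behind a small contract you can swap out.

    # A minimal sketch of a layered AI stack. Each layer hides behind a small
    # interface, so any one of them can be replaced independently.
    from typing import Protocol

    class Retriever(Protocol):              # Retrieval layer
        def fetch(self, query: str) -> list[str]: ...

    class Model(Protocol):                  # Model layer
        def generate(self, prompt: str) -> str: ...

    class Agent:                            # Agent layer: orchestration
        def __init__(self, retriever: Retriever, model: Model):
            self.retriever = retriever
            self.model = model

        def answer(self, question: str) -> str:
            context = "\n".join(self.retriever.fetch(question))
            return self.model.generate(f"Context:\n{context}\n\nQuestion: {question}")

    # UI layer: a chatbot, dashboard widget, or search box simply calls agent.answer().

Swap the vector database, the model vendor, or the front-end, and the other layers don’t have to change.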

The Retrieval Layer: Context is King

LLMs are powerful, but out of the box, they don’t know your business. They don’t know your product catalog, your customer support history, or your SOPs (standard operating procedures). That’s where retrieval comes in.

The retrieval layer bridges the model and your proprietary knowledge. It pulls the most relevant chunks of internal data (documents, databases, meeting transcripts) and feeds them into the model’s context window. Tools like vector databases, RAG (retrieval-augmented generation), and embeddings live here.

This is your knowledge infrastructure. Treat it like a back-end service: reliable, queryable, and constantly updated.
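
As an illustration, here’s a bare-bones retrieval step in Python. The embed() stub is an assumption standing in for whatever embedding model or vector database you actually use; what matters is the shape of the operation: embed the query, rank stored chunks by similarity, and hand the top matches to the model.

    # A stripped-down retrieval step: rank stored chunks by cosine similarity
    # to the query embedding and return the top k for the model's context.
    import math

    def embed(text: str) -> list[float]:
        # Stub embedding: in practice this calls your embedding model.
        vec = [0.0] * 16
        for i, ch in enumerate(text[:16]):
            vec[i] = float(ord(ch) % 7)
        return vec

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
        q = embed(query)
        ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
        return ranked[:k]

    context = retrieve("refund policy for enterprise customers",
                       ["SOP: refunds...", "Q3 meeting notes...", "Product catalog..."])

In production, a vector database handles the indexing and similarity search for you; the retrieval layer’s job is the same either way.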

The Model Layer: The Engine Room

This is where the magic happens: inference, generation, summarization, code writing, reasoning. But even here, don’t go all-in on a single model. Use a best-of-breed mindset.

Use a lightweight model (like Mistral or Phi) for fast tasks. Pull in a larger model (like GPT-4 or Claude) for heavy lifting. Fine-tune smaller models for domain-specific tasks. Chain models together if needed.

Your model layer should be flexible. It should use the right tool for the job — just like a microservices architecture uses different services for different workloads.
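
Here’s what that routing can look like in sketch form. call_model() is a hypothetical stand-in for whichever provider SDKs you actually use, and the routing rule is deliberately simple.

    # A minimal model router: a cheap, fast model by default, a larger model
    # for tasks flagged as complex. call_model() stands in for real SDK calls.
    LIGHT_MODEL = "small-fast-model"       # e.g. a Mistral- or Phi-class model
    HEAVY_MODEL = "large-reasoning-model"  # e.g. a GPT-4- or Claude-class model

    def call_model(model_name: str, prompt: str) -> str:
        # Placeholder: swap in the real API client for each provider here.
        return f"[{model_name}] response to: {prompt[:40]}"

    def route(prompt: str, complex_task: bool = False) -> str:
        model = HEAVY_MODEL if complex_task else LIGHT_MODEL
        return call_model(model, prompt)

    route("Summarize this support ticket.")                    # light model
    route("Draft a phased migration plan for our ERP.", True)  # heavy model

The rule here is a boolean flag; in practice it might be a classifier, a cost budget, or a latency target, but the shape of the layer stays the same.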

The Agent Layer: Brains with Structure

Think of agents as the application logic of AI.

This is where reasoning, memory, tools, and task orchestration come together. It’s what lets your AI not just generate content, but follow instructions, break down goals, and take actions across systems. LangChain, CrewAI, and AutoGen are examples of frameworks that power this layer.

Good agents know when to pull from the retrieval layer, which model to use, how to invoke APIs, and how to keep context across interactions. They’re the glue between raw capability and business value.

With a best-of-breed approach, your agent can integrate specialized tools, swap components, and scale logic across domains.
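
A toy version of that loop, building on the route() and retrieve() sketches above. The decision rules are intentionally naive; frameworks like LangChain, CrewAI, and AutoGen add planning, tool schemas, and error handling on top of this basic shape.

    # A toy agent step: decide whether to retrieve, pick a model, call it,
    # and keep memory across turns. Real agent frameworks elaborate on this.
    class SimpleAgent:
        def __init__(self, retriever, router):
            self.retriever = retriever          # retrieval layer (a callable)
            self.router = router                # model layer (e.g. route() above)
            self.memory: list[str] = []         # context carried across turns

        def step(self, user_input: str) -> str:
            # Naive heuristic: company-specific questions trigger retrieval.
            needs_context = "our" in user_input.lower()
            context = "\n".join(self.retriever(user_input)) if needs_context else ""
            history = "\n".join(self.memory)
            prompt = f"{history}\n{context}\nUser: {user_input}"
            answer = self.router(prompt, complex_task=len(user_input) > 200)
            self.memory.append(f"User: {user_input}\nAssistant: {answer}")
            return answer

    # Usage sketch: agent = SimpleAgent(lambda q: retrieve(q, company_chunks), route)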

The UI Layer: Don’t Just Chat. Embed.

Not every AI experience needs to look like ChatGPT. In fact, most shouldn’t.

Your AI layer should surface in the tools your teams already use: CRMs, email clients, internal wikis, analytics dashboards. Sometimes it’s a chatbot. Sometimes it’s an auto-summarizing UI component. Sometimes it’s just an AI-powered search box.

The point is: AI should feel native, not bolted on. The UI layer is where AI becomes part of how people actually work.
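
As one concrete, hypothetical sketch: rather than a standalone chat window, the stack can sit behind a small internal API that a CRM widget or wiki plugin calls. FastAPI is used here only as an example; summarize_record() is a stub standing in for the full retrieval, model, and agent pipeline.

    # A small internal endpoint that existing tools can call to embed AI output
    # inline (e.g. an auto-summary panel on a CRM record page).
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()

    class SummarizeRequest(BaseModel):
        record_id: str

    def summarize_record(record_id: str) -> str:
        # Stub: the retrieval, model, and agent layers would run here.
        return f"Summary for record {record_id} (stubbed)."

    @app.post("/ai/summarize")
    def summarize(req: SummarizeRequest):
        # The CRM's detail view calls this and renders the result inline, so AI
        # shows up where people already work instead of in a separate chat app.
        return {"summary": summarize_record(req.record_id)}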

Why This Stack Matters

Let’s be clear: building an AI stack takes more thought than spinning up a single chatbot. But the payoff is real.

  • Adaptability: Swap models, upgrade frameworks, change interfaces — without rewriting everything.
  • Scalability: Handle more users, data, and complexity without bottlenecks.
  • Security: Keep sensitive data in the retrieval layer instead of piping it into external APIs.
  • Alignment: Build AI that actually knows your business, reflects your values, and fits your workflows.

Composable AI stacks don’t just work better; they align with where the industry is heading.

We’ve Been Here Before

Software has gone through this cycle before: from best-of-breed tools to bloated monoliths, and back again to composable systems. AI is following the same arc.

We’re swinging away from centralized, “one model to rule them all” thinking. The businesses that embrace modularity and build AI as infrastructure, not as a feature, will move faster, experiment more, and integrate AI where it matters.

Your business already runs on a stack. Your AI should too.

Christian Buckley

Christian is a Microsoft Regional Director and M365 MVP (focused on SharePoint, Teams, and Copilot), and an award-winning product marketer and technology evangelist, based in Dallas, Texas. He is a startup advisor and investor, and an independent consultant providing fractional marketing and channel development services for Microsoft partners. He hosts the #CollabTalk Podcast, #ProjectFailureFiles series, Guardians of M365 Governance (#GoM365gov) series, and the Microsoft 365 Ask-Me-Anything (#M365AMA) series.