Domain Expertise in the Age of AI

Why knowing the business matters more than ever — and how AI is changing who gets to know it

Article · August 28, 2025 · 10-minute read

AI has made knowledge radically cheaper. A developer can now ask an LLM to explain insurance adjudication, parse a regulatory filing, or summarize a 200-page procurement standard — and get a useful answer in seconds. But “useful” is not the same as “correct.” And the gap between the two is exactly where domain expertise lives.

The collective knowledge embedded in large language models is extraordinary. It spans law, medicine, finance, logistics, manufacturing, and dozens of other fields. Product owners, business analysts, and developers now have access to a knowledge base that would have required a shelf of textbooks and years of experience to accumulate. This is genuinely transformative.

It is also genuinely dangerous — if you do not know what questions to ask. An LLM will confidently answer a shallow question with a shallow answer that looks plausible but misses the edge cases that matter in production. The people who know which questions to ask — the subject matter experts, the experienced BAs, the developers who have lived inside a domain — are not being replaced by AI. They are being amplified by it.

The paradox

AI makes knowledge more accessible and domain expertise more valuable — at the same time.

Consider what it used to cost to understand a new business domain deeply enough to build software for it. A developer joining a healthcare project would spend weeks reading documentation, shadowing clinicians, and sitting in requirements sessions before they could meaningfully contribute. A business analyst entering the compliance space would need months to internalize the relevant regulations.

AI compresses that timeline dramatically. An LLM can summarize HIPAA privacy rules, explain the difference between ICD-10 and CPT coding systems, or walk through the lifecycle of a customs declaration — in minutes. Laws can be parsed and cross-referenced. Procedures and business rules can be swept through and mapped. Technical standards that would take days to read can be distilled into actionable summaries.

This is not a marginal improvement. It is a structural change in who can access domain knowledge and how quickly they can get productive.

90% reduction in initial domain ramp-up time when developers use AI to explore unfamiliar business contexts

200+ pages of regulation summarized and cross-referenced in under an hour — work that previously took a compliance analyst a full week

5+ adjacent business domains that a single analyst or developer can now meaningfully contribute to — up from one or two

Here is the risk. If you ask an LLM “how does billing work?”, you get a textbook answer: charges, payments, invoices. It is accurate at a 10,000-foot level and completely useless for building a billing system. The complexities that matter — contractual adjustments, multi-payer coordination, retroactive rate changes, partial refunds with tax implications — only surface when someone who has lived in the domain knows to ask about them.

The pattern repeats across every industry. Shallow prompts produce shallow models. And shallow models produce systems that work in demos and break in production — because production is where the edge cases live.

Shallow question: “How do I calculate the patient’s bill?”

AI produces a price × quantity formula. Misses insurance adjustments, contractual write-offs, and prior-authorization rules.

Expert question: “Walk me through the adjudication flow for a Blue Cross PPO claim with a secondary Medicare payer, including denial scenarios.”

AI surfaces the exact edge cases — coordination of benefits, timely filing limits, and appeal workflows — that would otherwise become production bugs.
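The gap between the two answers is easy to see in code. Here is a minimal sketch, with entirely hypothetical rates and payment amounts, of what the shallow formula leaves out:

```python
# Illustrative sketch only: the rates, amounts, and rules below are
# invented for the example, not real payer or adjudication logic.

def naive_bill(price: float, quantity: int) -> float:
    """The 'textbook' answer: price times quantity."""
    return price * quantity

def adjudicated_bill(price: float, quantity: int,
                     contractual_rate: float,
                     primary_paid: float,
                     secondary_paid: float) -> float:
    """Patient responsibility after a contractual write-off and
    coordination of benefits across two payers."""
    charges = price * quantity
    # Contractual adjustment: the provider writes off the difference
    # between billed charges and the payer's negotiated rate.
    allowed = charges * contractual_rate
    # Coordination of benefits: the secondary payer covers part of
    # what the primary leaves behind; the patient owes the remainder.
    return max(allowed - primary_paid - secondary_paid, 0.0)

print(naive_bill(200.0, 2))                          # 400.0
print(adjudicated_bill(200.0, 2, 0.6, 180.0, 30.0))  # 30.0
```

Even this sketch only scratches the surface: real adjudication adds timely filing limits, denials, and appeals — which is exactly why the expert’s question matters.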

The difference is not that AI cannot answer the deep question. It often can — with remarkable accuracy. The difference is that someone has to know to ask it. Subject matter experts do not just carry knowledge; they carry the shape of the knowledge — they know where the gaps are, where the exceptions hide, and where the textbook answer diverges from how things actually work on the ground.

“AI gave us the vocabulary. Our domain expert gave us the grammar. The vocabulary alone would have shipped a system that looked right and calculated wrong.”

— Tech Lead, insurance platform

For decades, the standard model of software delivery has been: a business analyst gathers requirements, a product owner writes stories, and developers implement tickets. Developers see their slice — a single endpoint, a UI component, a data transformation — but rarely the full business process their code serves. They build the bricks without seeing the building.

AI is changing this in two ways that reinforce each other.

Codebase-to-business mapping

AI coding agents can analyze an entire codebase and explain what it does in business terms — not just “this function calculates a value” but “this function applies the contractual discount rate for wholesale customers with volume tiers above 10,000 units.” Developers can finally see the business logic their code implements, even in codebases they did not write.
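As a hypothetical illustration — the function names, tier boundary, and discount rate below are invented, not taken from any real codebase — the jump from “this function calculates a value” to a business-term explanation might look like:

```python
# Invented example: constants and names are assumptions for illustration.
WHOLESALE_VOLUME_TIER = 10_000   # units; assumed tier boundary
WHOLESALE_DISCOUNT_RATE = 0.12   # assumed contractual discount rate

def calc(v, q):
    # Before: all a reader can say is "this function calculates a value."
    return v * (1 - 0.12) if q > 10_000 else v

def wholesale_tier_price(list_price: float, annual_volume: int) -> float:
    """After: applies the contractual discount rate for wholesale
    customers whose annual volume exceeds the 10,000-unit tier."""
    if annual_volume > WHOLESALE_VOLUME_TIER:
        return list_price * (1 - WHOLESALE_DISCOUNT_RATE)
    return list_price
```

The two functions compute the same thing; the second one carries the business meaning that an AI agent can surface from the surrounding code and documentation.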

Gap identification

When AI maps the codebase against domain knowledge, it exposes what is missing. “Your order processing pipeline handles standard orders but has no path for returns with restocking fees.” “Your compliance module checks sanctions lists at onboarding but not on ongoing transactions.” Developers see the gaps before users find them.

The effect is that developers move from being ticket-executors to being participants in the domain. They understand why the system behaves the way it does, which means they can anticipate problems, propose better solutions, and push back on requirements that do not make sense. The quality of the software improves because the people building it understand the world it serves.

“I used to get a Jira ticket that said ‘add a discount field.’ Now the AI agent shows me the entire pricing model, explains why the discount exists, and flags that my implementation would break the volume-tier calculation. I am building better software because I understand the business better.”

— Senior Developer, B2B e-commerce platform

One of the most underappreciated effects of AI-accessible knowledge is domain mobility. When it takes months to learn a new industry, people stay in their lane. When it takes days, they start exploring.

A business analyst who spent five years in logistics can now ramp up on customs regulations, trade compliance, and cross-border taxation in a fraction of the time it would have taken before. A developer who built payment systems can extend into lending, insurance, or treasury management — because AI can bridge the knowledge gap between adjacent domains.

This is not about becoming a domain expert overnight. It is about becoming conversant enough to contribute meaningfully, ask the right follow-up questions, and know when to bring in deeper expertise. The radius of what a single person can understand and influence has expanded.

Payments → Lending & credit

AI summarizes credit scoring models, explains APR calculations, and maps regulatory differences between consumer and commercial lending.

Logistics → Customs & trade compliance

AI parses HS tariff codes, explains rules of origin, and cross-references free trade agreements — tasks that previously required a licensed customs broker.

Healthcare IT → Pharmaceutical supply chain

AI explains DSCSA serialization requirements, cold-chain compliance, and DEA reporting — connecting clinical knowledge to distribution logistics.

E-commerce → Marketplace compliance

AI surfaces VAT nexus rules, consumer protection laws by jurisdiction, and platform liability frameworks — enabling teams to expand across borders.

If AI can summarize regulations, explain business processes, and draft requirements, what is left for the product owner or business analyst? Everything that matters.

AI handles the what — the factual layer of domain knowledge. Humans provide the so what — the judgment about which facts matter, which edge cases will actually occur in practice, which requirements are negotiable, and which will cause a regulatory audit if implemented incorrectly.

How roles are evolving

Product Owner

Before AI

Gatekeeper of requirements. Translates business needs into user stories. Prioritizes backlog based on stakeholder input.

With AI

Uses AI to explore domain depth independently. Validates AI-generated requirements against real-world constraints. Focuses on strategic trade-offs and prioritization judgment rather than requirements documentation.

Business Analyst

Before AI

Interviews stakeholders, documents processes, writes specifications. Often the only person who understands both the business and the system.

With AI

Uses AI to draft initial process maps and specifications in hours instead of weeks. Spends time validating, stress-testing edge cases, and identifying gaps that AI misses. Becomes a domain quality gate rather than a documentation factory.

Developer

Before AI

Receives tickets, implements features, rarely sees the full business context. Asks clarifying questions when specifications are ambiguous.

With AI

Uses AI to understand the business domain directly. Queries the codebase in business terms. Identifies gaps between implementation and real-world requirements. Contributes to domain modelling, not just code.

The playbook

  1. Pair AI with domain experts, not instead of them. Use AI to accelerate the expert, not to replace them. Let the expert ask the deep questions; let AI do the research legwork. The expert validates, challenges, and refines — that is where accuracy lives.
  2. Give developers business context, not just tickets. Use AI coding agents to generate business-context documentation for the codebase. When developers can query “what business rule does this module implement?” and get a clear answer, they build better software.
  3. Encourage domain mobility. When knowledge is cheap, people can explore adjacent domains. Support BAs and developers in expanding their domain range — it makes teams more resilient, more creative, and less dependent on single points of knowledge failure.
  4. Treat domain validation as a quality gate. AI-generated requirements and specifications should be reviewed by someone who knows the domain — the same way AI-generated code is reviewed by a senior engineer. Build this into your process, not as an afterthought.
  5. Revisit the features you shelved. Many features were deprioritized not because they were unimportant, but because understanding the domain requirements was too expensive. With AI compressing that cost, re-evaluate the backlog with fresh eyes.

The age of AI does not devalue domain expertise. It restructures who holds it, how quickly it can be acquired, and how deeply it can be applied. The people who thrive are not those who memorize facts — AI handles that now. They are the ones who know which facts matter, which questions to ask next, and where the model breaks down.

For organizations building software, the implication is clear: invest in people who understand the world your software serves. Give them AI tools to move faster. And give your developers the business context they have been missing. The systems you build will be better for it.
