LLM-Wiki

Definition

A maintained knowledge base where raw sources are turned into readable, linked concept pages that both people and AI systems can query. The LLM-Wiki is the teaching and retrieval layer — not a filing system.

How It Works

Instead of asking an AI to rediscover the same ideas from scattered PDFs, slides, and links every time, the wiki stores the synthesis as durable pages. Each concept page defines an idea, explains why it matters for leaders, links to related concepts, and points to supporting sources.
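A concept page as described above can be modeled as a small record: definition, why it matters, links to related concepts, and pointers to sources. This is a minimal sketch, assuming illustrative field names and example content (the wiki itself does not prescribe a schema):

```python
from dataclasses import dataclass, field

@dataclass
class ConceptPage:
    """One wiki page: an interpreted concept, not a raw source."""
    title: str
    definition: str                                   # what the idea is
    why_it_matters: str                               # relevance for leaders
    related: list[str] = field(default_factory=list)  # links to other concept pages
    sources: list[str] = field(default_factory=list)  # pointers to supporting raw sources

    def render(self) -> str:
        """Human-readable first: render the page as plain text."""
        lines = [self.title, "", self.definition, "", self.why_it_matters]
        if self.related:
            lines += ["", "Related: " + ", ".join(self.related)]
        if self.sources:
            lines += ["", "Sources: " + ", ".join(self.sources)]
        return "\n".join(lines)

# Hypothetical example page; titles and text are invented for illustration.
page = ConceptPage(
    title="Adoption Gap",
    definition="The distance between tool availability and actual use.",
    why_it_matters="Leaders often mistake access for adoption.",
    related=["RAG", "Iceberg Concept"],
    sources=["workshop notes"],
)
print(page.render())
```

Keeping `render()` as plain text reflects the format rule below: the page reads as a handbook entry first, and the structured fields make it queryable second.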

The design principle: compounding knowledge. Every time a new article, client example, or workshop insight is ingested, the wiki gets better rather than just bigger. The raw sources remain separate. The wiki pages are the interpreted layer.

What It Is For

For workshop participants: revisit ideas after the session, understand the language of AI change leadership, explore how concepts connect.

For the facilitator: a living knowledge base that improves with each delivery, preserves what was learned, and can power a client-facing website or AI query layer.

For AI-assisted retrieval (see RAG): pages already contain the teaching logic — not just raw excerpts. When a client asks “what is the adoption gap?” the system retrieves a page that explains, contextualizes, and connects the concept rather than a fragment from a PDF.
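The retrieval behavior can be sketched with a toy scorer: match a query against whole pages and return the best page intact, rather than a fragment. This is a minimal illustration, assuming a simple word-overlap score and invented page text; a real RAG stack would use embeddings, but the page-level unit of retrieval is the point:

```python
def tokenize(text: str) -> set[str]:
    """Lowercase, split on whitespace, strip simple punctuation."""
    return {w.strip("?.,!") for w in text.lower().split()}

def retrieve(query: str, pages: dict[str, str]) -> tuple[str, str]:
    """Return the (title, body) of the page with the greatest word overlap."""
    q = tokenize(query)
    return max(pages.items(),
               key=lambda kv: len(q & tokenize(kv[0] + " " + kv[1])))

# Hypothetical pages; contents are invented for illustration.
pages = {
    "Adoption Gap": "The adoption gap is the distance between tool availability "
                    "and actual use. Related: RAG, Iceberg Concept.",
    "Iceberg Concept": "Visible tool use rests on unseen habits and incentives.",
}

title, body = retrieve("what is the adoption gap?", pages)
print(title)  # → Adoption Gap (the whole page is returned, not an excerpt)
```

Because each retrieved unit is a full concept page, the answer already carries the definition, the context, and the links to related concepts.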

The Format Rule

Pages should read like a clear digital handbook, not a database. Human-readable first. Queryable second.

Connections

RAG
Context as Differentiator
Iceberg Concept

Tags: knowledge base, wiki, retrieval, client resource