fd1791620edeb5468053abd2108591f3b34a5cd8
MCP Servers.md
new file mode 100644
index 0000000..99598d0
@@ -0,0 +1,32 @@
+---
+visibility: public
+---
+
+# MCP Servers for LLM Wiki Maintenance
+
+MCP-native implementations that any LLM client can connect to for wiki maintenance.
+
+## Available Servers
+
+| Repo | Stars | Description |
+|------|------:|-------------|
+| [xoai/sage-wiki](https://github.com/xoai/sage-wiki) | 341 | Captures knowledge directly from AI conversations. Runs as MCP server. |
+| [lucasastorian/llmwiki](https://github.com/lucasastorian/llmwiki) | 192 | Upload docs, connect your Claude account via MCP, and have it write your wiki. Open-source implementation of the Karpathy pattern. |
+| [iamsashank09/llm-wiki-kit](https://github.com/iamsashank09/llm-wiki-kit) | 28 | Available on [Glama](https://glama.ai/mcp/servers/iamsashank09/llm-wiki-kit). Supports ingestion of PDFs, URLs, and YouTube videos. |
+
+## How MCP Fits the Pattern
+
+MCP servers expose the [[The Pattern|three operations]] (ingest, query, lint) as tools that any compliant LLM client can call. This means:
+
+- **Claude Code** can maintain a wiki via MCP while working on code
+- **Cursor/Windsurf** can reference wiki context during coding
+- **Any MCP client** gets wiki maintenance for free
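None of the listed servers publishes a canonical tool schema in this note, so the following is a dependency-free sketch of the shape such a server might take. The tool names (`wiki_ingest`, `wiki_query`, `wiki_lint`), their signatures, and the dict-based registry are illustrative assumptions; a real server would register these through an MCP SDK rather than a hand-rolled decorator.

```python
import json

# Hypothetical tool registry sketching how an MCP server might expose
# the three wiki operations as callable tools. Handlers are stubs.
TOOLS = {}

def tool(name, description):
    """Register a function as a named tool with a human-readable description."""
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("wiki_ingest", "Fetch a source into raw/ and ripple-update wiki pages")
def wiki_ingest(url: str) -> dict:
    return {"status": "ingested", "source": url, "pages_updated": []}

@tool("wiki_query", "Search the wiki and answer with citations")
def wiki_query(question: str) -> dict:
    return {"answer": "", "citations": []}

@tool("wiki_lint", "Fix broken links, flag contradictions, find orphans")
def wiki_lint() -> dict:
    return {"broken_links": [], "orphans": [], "stale": []}

def call_tool(name: str, **arguments):
    """Dispatch a tool invocation the way an MCP client request would."""
    return TOOLS[name]["handler"](**arguments)

print(json.dumps(call_tool("wiki_ingest", url="https://example.com/post")))
```

Whatever the transport, the client only ever sees the tool names and descriptions; the wiki directory layout stays an implementation detail of the server.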
+
+## Comparison: MCP Server vs Claude Code Skill
+
+| Aspect | MCP Server | Claude Code Skill |
+|--------|-----------|-------------------|
+| Client compatibility | Any MCP client | Claude Code only |
+| Setup | Config in MCP settings | Drop a skill file into the project |
+| Isolation | Separate process | Same process |
+| Best for | Multi-client workflows | Claude Code power users |
\ No newline at end of file
The Pattern.md
new file mode 100644
index 0000000..c849760
@@ -0,0 +1,39 @@
+---
+visibility: public
+---
+
+# The Karpathy LLM Wiki Pattern
+
+**Source:** [Karpathy's gist](https://gist.github.com/karpathy/442a6bf555914893e9891c11519de94f) (14k+ stars)
+
+A structured markdown knowledge base **maintained by an LLM**, not just queried by one.
+
+## Three Layers
+
+### 1. Raw Sources (`raw/`)
+Immutable documents — articles, PDFs, transcripts, images. The LLM reads but never modifies these.
+
+### 2. Wiki (`wiki/`)
+LLM-generated markdown pages: summaries, entity pages, concept pages, comparisons. The LLM owns this layer entirely.
+
+### 3. Schema (`CLAUDE.md` / `AGENTS.md`)
+Configuration telling the LLM how to structure, ingest, format, and cross-reference.
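The gist leaves the schema's exact contents up to the user; a hypothetical minimal `CLAUDE.md` (every rule below is an illustration, not a quote from the gist) might read:

```markdown
# Wiki Schema

- Every page lives in `wiki/` and starts with YAML frontmatter
  (title, type, sources, related, confidence).
- Page types: concept, entity, source-summary, comparison.
- On ingest: save the source to `raw/`, then update every wiki page
  the source touches, plus `index.md` and `log.md`.
- Cross-reference with `[[wikilinks]]`; never leave a broken link.
- Never modify anything under `raw/`.
```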
+
+## Three Operations
+
+| Operation | What it does |
+|-----------|-------------|
+| **Ingest** | Fetch source into `raw/`, ripple-update wiki pages, update index + cross-references |
+| **Query** | Search and answer with citations |
+| **Lint** | Fix broken links, flag contradictions, find orphans, mark stale content |
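The link and orphan checks in the lint operation are mechanical enough to sketch without an LLM (contradiction and staleness detection are where the model earns its keep). A minimal sketch, assuming `[[Page]]`/`[[Page|label]]` wikilink syntax and pages held in memory rather than read from a `wiki/` directory:

```python
import re

def lint(pages: dict[str, str]) -> dict:
    """Lint a wiki given {page_name: markdown_text}.

    Returns broken [[wikilinks]] (target page missing) and orphan
    pages (no other page links to them).
    """
    link_re = re.compile(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]")
    linked = set()
    broken = []
    for name, text in pages.items():
        for target in link_re.findall(text):
            target = target.strip()
            linked.add(target)
            if target not in pages:
                broken.append((name, target))
    # index.md is the entry point, so it is never counted as an orphan.
    orphans = [n for n in pages if n not in linked and n != "index"]
    return {"broken_links": broken, "orphans": orphans}

wiki = {
    "index": "See [[The Pattern]] and [[Tools]].",
    "The Pattern": "Links back to [[index]].",
    "Tools": "Mentions [[Missing Page]].",
    "Orphan Note": "Nothing links here.",
}
report = lint(wiki)
```

Here `report` flags `Missing Page` as a broken link out of `Tools` and `Orphan Note` as an orphan, which is exactly the kind of report an LLM can then act on.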
+
+## Key Mechanics
+
+- Ingesting one source can update **10-15 wiki pages** (ripple updates)
+- `index.md` = content catalog; `log.md` = append-only activity record
+- Pages have frontmatter: title, type (concept/entity/source-summary/comparison), sources, related, confidence
+- "The LLM is the programmer; Obsidian is the IDE; the wiki is the codebase"
+
+## Why It Matters
+
+Personal wikis die because maintenance is boring. LLMs don't get bored. The pattern solves the fundamental problem of knowledge base decay by making the AI responsible for keeping everything consistent, cross-referenced, and up to date.
\ No newline at end of file
Tools & Implementations.md
new file mode 100644
index 0000000..120cbd5
@@ -0,0 +1,50 @@
+---
+visibility: public
+---
+
+# Tools & Implementations
+
+GitHub repos implementing the [[The Pattern|Karpathy LLM Wiki pattern]], ranked by stars (as of 2026-04-10).
+
+## Top Tier (1000+ stars)
+
+| Stars | Repo | Type | Notes |
+|------:|------|------|-------|
+| **1,542** | [SamurAIGPT/llm-wiki-agent](https://github.com/SamurAIGPT/llm-wiki-agent) | Multi-agent CLI | Works with Claude Code, Codex, Gemini CLI. No API key needed. **Community leader.** |
+
+## Mid Tier (300-500 stars)
+
+| Stars | Repo | Type | Notes |
+|------:|------|------|-------|
+| 481 | [nashsu/llm_wiki](https://github.com/nashsu/llm_wiki) | Desktop app | Cross-platform, docs to wiki |
+| 477 | [sdyckjq-lab/llm-wiki-skill](https://github.com/sdyckjq-lab/llm-wiki-skill) | Claude Code skill | Chinese community, multi-platform |
+| 347 | [AgriciDaniel/claude-obsidian](https://github.com/AgriciDaniel/claude-obsidian) | Obsidian companion | Claude + Obsidian integration |
+| 341 | [xoai/sage-wiki](https://github.com/xoai/sage-wiki) | MCP server | Captures knowledge from AI conversations |
+| 306 | [atomicmemory/llm-wiki-compiler](https://github.com/atomicmemory/llm-wiki-compiler) | Compiler | Raw sources in, interlinked wiki out |
+
+## Growing (100-300 stars)
+
+| Stars | Repo | Type | Notes |
+|------:|------|------|-------|
+| 264 | [Ar9av/obsidian-wiki](https://github.com/Ar9av/obsidian-wiki) | Framework | AI agents build/maintain Obsidian wiki |
+| 209 | [Astro-Han/karpathy-llm-wiki](https://github.com/Astro-Han/karpathy-llm-wiki) | Skill | One skill to build your own Karpathy-style LLM wiki |
+| 192 | [lucasastorian/llmwiki](https://github.com/lucasastorian/llmwiki) | MCP server | Upload docs, connect Claude, it writes your wiki |
+| 159 | [nvk/llm-wiki](https://github.com/nvk/llm-wiki) | Multi-agent | Research, ingestion, compilation, querying |
+| 147 | [ussumant/llm-wiki-compiler](https://github.com/ussumant/llm-wiki-compiler) | Claude Code plugin | Compiles markdown, ~90% context reduction |
+
+## Early Stage (< 100 stars)
+
+| Stars | Repo | Type |
+|------:|------|------|
+| 77 | [Ss1024sS/LLM-wiki](https://github.com/Ss1024sS/LLM-wiki) | Baseline implementation |
+| 51 | [kfchou/wiki-skills](https://github.com/kfchou/wiki-skills) | Claude Code skills |
+| 46 | [Pratiyush/llm-wiki](https://github.com/Pratiyush/llm-wiki) | Multi-platform |
+| 29 | [MehmetGoekce/llm-wiki](https://github.com/MehmetGoekce/llm-wiki) | L1/L2 cache + Logseq |
+| 28 | [iamsashank09/llm-wiki-kit](https://github.com/iamsashank09/llm-wiki-kit) | MCP server on Glama |
+| 28 | [toolboxmd/karpathy-wiki](https://github.com/toolboxmd/karpathy-wiki) | Claude Code skills |
+| 27 | [ekadetov/llm-wiki](https://github.com/ekadetov/llm-wiki) | Obsidian-backed plugin |
+
+## Packages
+
+- **npm:** `llm-wiki` — CLI with multi-provider support (OpenAI, Anthropic, DeepSeek, Ollama), auto-linking, wiki linting
+- **PyPI:** `llmwiki` — Python package, no vector DB or embeddings needed
\ No newline at end of file
WikiHub Opportunity.md
new file mode 100644
index 0000000..9378562
@@ -0,0 +1,41 @@
+---
+visibility: public
+---
+
+# WikiHub Opportunity
+
+Every implementation in the [[Tools & Implementations]] list is **local-first**. None of them offer:
+
+- Hosting and publishing
+- Social features (fork, star, discover)
+- Access control and sharing
+- Git-backed collaboration
+- Agent-native API + MCP server
+
+WikiHub is the missing layer.
+
+## The Gap
+
+| What exists | What's missing |
+|-------------|---------------|
+| Local wiki maintenance tools | A place to **publish** the wiki |
+| CLI/skill for ingesting sources | A platform for **discovering** others' wikis |
+| Personal knowledge bases | **Social knowledge** — forking, starring, profiles |
+| Git-backed storage (local) | **Hosted git** with Smart HTTP push/clone |
+
+## WikiHub's Position
+
+> "GitHub for LLM wikis"
+
+Just as GitHub didn't replace `git` — it made git repos social, discoverable, and hosted — WikiHub doesn't replace the maintenance tools. It's where the maintained wikis **live**.
+
+## Potential Integration
+
+WikiHub could offer the maintenance skill as a **built-in capability**:
+
+1. User creates a wiki on WikiHub
+2. WikiHub's MCP server already exposes 13 tools (read, write, search, etc.)
+3. A "maintain" operation could be added that runs ingest/lint on a schedule or on-demand
+4. Agents connect via MCP, push raw sources, and WikiHub's built-in agent maintains the wiki pages
+
+This would make WikiHub the **full-stack platform** — not just hosting, but active maintenance.
\ No newline at end of file
index.md
new file mode 100644
index 0000000..22b0dce
@@ -0,0 +1,22 @@
+---
+visibility: public
+---
+
+# LLM Wiki Ecosystem
+
+Research on the tools, skills, and MCP servers implementing Andrej Karpathy's **LLM Wiki** pattern — where an LLM maintains a structured markdown knowledge base from raw sources.
+
+## Pages
+
+- [[The Pattern]] — Karpathy's 3-layer architecture (raw → wiki → schema)
+- [[Tools & Implementations]] — GitHub repos ranked by community adoption
+- [[MCP Servers]] — MCP-native implementations for LLM clients
+- [[WikiHub Opportunity]] — How WikiHub differentiates as the hosting layer
+
+## Key Insight
+
+All existing implementations are **local-first** tools. None offer hosting, sharing, or social features. WikiHub is the missing layer — "GitHub for LLM wikis" — where these wikis get published, forked, and discovered.
+
+---
+
+*Research conducted 2026-04-10. Star counts are snapshots.*
\ No newline at end of file