Your research disappears into chat sessions. Vaultweaver makes it compound.
Install • Quick Start • How It Works • Commands • Why Not RAG?
Most LLM token spend goes into throwaway chat responses -- you ask, get an answer, close the tab. Nothing persists.
Vaultweaver flips this. It compiles raw sources into an interconnected wiki that grows richer with every source you add. The wiki becomes a persistent, compounding artifact -- not a disposable conversation.
## Install

```bash
git clone https://github.com/Apexlify/vaultweaver.git
cd vaultweaver
bash install.sh
```

That's it. Start a new Claude Code session and you have `/wiki`.
**Manual install**

```bash
cp -r . ~/.claude/skills/vaultweaver
cp SKILL.md ~/.claude/commands/wiki.md
```

## Quick Start

```
/wiki init "My Research"
```
Drop files into `raw/` -- papers, articles, notes, screenshots, anything. Then:

```
/wiki ingest
/wiki compile
/wiki query "How does X relate to Y?"
```
Or skip the commands -- just talk:
"I dropped 3 papers in raw, process them"
"What do we know about attention scaling?"
"Check the wiki for issues"
Vaultweaver triggers automatically.
## How It Works

```mermaid
flowchart LR
    subgraph YOU
        A[Drop files\nin raw/]
        B[Ask\nquestions]
        C[Say\n'file it']
    end
    subgraph CLAUDE
        D[Ingest\n10-15 pages\nper source]
        E[Query\nwith citations]
        F[Compound\ninto wiki]
    end
    subgraph WIKI["wiki/"]
        G[sources/]
        H[concepts/]
        I[entities/]
        J[queries/]
        K[index.md]
    end
    A --> D
    D --> G & H & I & K
    B --> E
    E --> B
    C --> F
    F --> J & K
    style YOU fill:#1a1a2e,stroke:#e94560,color:#e0e0e0
    style CLAUDE fill:#1a1a2e,stroke:#53a8b6,color:#e0e0e0
    style WIKI fill:#1a1a2e,stroke:#7c3aed,color:#e0e0e0
```
| Layer | What | Owner |
|---|---|---|
| `raw/` | Source documents -- papers, articles, notes, images | You (immutable) |
| `wiki/` | Compiled knowledge -- summaries, concepts, entities, queries | Claude (writes everything) |
| `SKILL.md` | Schema -- conventions, triggers, workflows | Vaultweaver |
## Commands

| Command | Description | LLM |
|---|---|---|
| `/wiki init "Topic"` | Create wiki structure | -- |
| `/wiki ingest` | Process `raw/` sources -- 10-15 pages per source | Yes |
| `/wiki compile` | Build missing concept + entity articles | Yes |
| `/wiki query "Q"` | Answer from wiki with `[[citations]]` | Yes |
| `/wiki lint` | Find broken links, gaps, contradictions | Yes |
| `/wiki search "term"` | BM25 ranked search (instant, local) | -- |
| `/wiki status` | File counts + recent activity | -- |
| `/wiki serve` | Web search UI on localhost:5000 | -- |
Two hooks run automatically -- zero commands needed:
```mermaid
sequenceDiagram
    participant U as You
    participant C as Claude Code
    participant W as Wiki
    Note over C: Session Start
    C->>W: check-wiki-drift.sh
    W-->>C: "2 unprocessed files in raw/"
    C-->>U: Drift alert
    Note over U: During session...
    U->>W: Drop new files in raw/
    U->>C: Ask questions, do work
    Note over C: Session End
    C->>W: Auto-ingest new raw/ files
    C->>W: Update concepts, entities, index
    W-->>C: Wiki updated silently
```
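The drift check at session start boils down to a filename comparison: a raw file counts as unprocessed when no summary with the same stem exists under `wiki/sources/`. A minimal Python sketch of that logic (the real hook is the shell script `check-wiki-drift.sh`; the stem-matching convention here is an assumption, not the documented contract):

```python
from pathlib import Path

def unprocessed_sources(project: Path) -> list[str]:
    """Return raw/ files with no matching summary in wiki/sources/.

    Assumes ingesting raw/<name>.<ext> writes wiki/sources/<name>.md --
    an assumption for illustration, not Vaultweaver's spec.
    """
    processed = {p.stem for p in (project / "wiki" / "sources").glob("*.md")}
    return sorted(
        f.name
        for f in (project / "raw").iterdir()
        if f.is_file() and f.stem not in processed
    )

# Report drift the way the SessionStart hook does:
# files = unprocessed_sources(Path("my-project"))
# if files:
#     print(f"{len(files)} unprocessed files in raw/: {', '.join(files)}")
```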
Each source goes through a 7-step pipeline, touching 10-15 wiki pages:
```mermaid
flowchart TD
    A[raw/paper.md] --> B[READ\nExtract claims, entities,\nconcepts, data points]
    B --> C[WRITE SOURCE\nwiki/sources/paper.md]
    C --> D[CREATE CONCEPTS\nwiki/concepts/*.md]
    C --> E[CREATE ENTITIES\nwiki/entities/*.md]
    C --> F[CROSS-REFERENCE\nUpdate existing pages]
    C --> G[UPDATE INDEX\nwiki/index.md]
    C --> H[APPEND LOG\nwiki/log.md]
    style A fill:#e94560,stroke:#e94560,color:#fff
    style B fill:#1a1a2e,stroke:#53a8b6,color:#e0e0e0
    style C fill:#1a1a2e,stroke:#7c3aed,color:#e0e0e0
    style D fill:#1a1a2e,stroke:#53a8b6,color:#e0e0e0
    style E fill:#1a1a2e,stroke:#53a8b6,color:#e0e0e0
    style F fill:#1a1a2e,stroke:#f59e0b,color:#e0e0e0
    style G fill:#1a1a2e,stroke:#10b981,color:#e0e0e0
    style H fill:#1a1a2e,stroke:#10b981,color:#e0e0e0
```
Why 10-15 pages? A paper mentions dozens of concepts. If you only write a summary, you get an isolated note. Create concept pages, entity pages, and cross-references, and you get a knowledge graph where every future source plugs into existing connections. Your 20th paper is 10x more valuable than your 1st.
```
User: /wiki query "What are the tradeoffs between
      exact and approximate attention?"

Claude: Based on [[concepts/flash-attention]],
[[concepts/linformer]], and [[concepts/performer]]:

| Method          | Exact? | Complexity | Memory |
|-----------------|--------|------------|--------|
| Full attention  | Yes    | O(n^2)     | O(n^2) |
| Flash Attention | Yes    | O(n^2)     | O(n)   |
| Linformer       | No     | O(n)       | O(n)   |
| Performer       | No     | O(n)       | O(n)   |

Want me to file this into the wiki?

User: yes

Claude: --> wiki/queries/exact-vs-approximate-attention.md
```
Filed answers compound. Next time you query about attention, this synthesis is already part of the wiki.
`/wiki lint` runs seven checks:
| Check | What it finds | Auto-fix? |
|---|---|---|
| Contradictions | Conflicting claims across pages | Reports for judgment |
| Stale claims | Superseded by newer sources | Partial |
| Orphan pages | No inbound [[links]] | Adds cross-refs |
| Missing concepts | Mentioned often, no article | Creates stubs |
| Broken links | [[links]] to non-existent pages | Creates stubs |
| Data gaps | Partially covered topics | Suggests sources |
| Thin pages | Too little content | Partial |
Lint tells you what to research next. It turns the wiki from a static collection into a research roadmap.
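The broken-links check is simple to sketch: collect every `[[wikilink]]` target across the wiki, normalize it, and subtract the pages that actually exist. A hedged Python sketch (the function names, regex, and flat-namespace assumption are mine, not Vaultweaver's implementation):

```python
import re
from pathlib import Path

# Capture the link target before any |alias or #anchor
WIKILINK = re.compile(r"\[\[([^\]|#]+)")

def slug(name: str) -> str:
    """Normalize a link target: drop folder prefix, lowercase, hyphenate."""
    return name.split("/")[-1].strip().lower().replace("_", "-").replace(" ", "-")

def broken_links(wiki: Path) -> set[str]:
    """Return link targets that resolve to no existing page (flat namespace assumed)."""
    existing = {p.stem for p in wiki.rglob("*.md")}
    targets = set()
    for page in wiki.rglob("*.md"):
        targets.update(slug(t) for t in WIKILINK.findall(page.read_text()))
    return targets - existing
```

Auto-fix is then one step away: create a stub page for each returned slug.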
Link normalization rules
All of these resolve to the same page:
| Input | Resolves to |
|---|---|
| `[[gpt-series]]` | `gpt-series.md` |
| `[[GPT_series]]` | `gpt-series.md` |
| `[[GPT Series]]` | `gpt-series.md` |
| `[[concepts/gpt-series]]` | `gpt-series.md` |
| `[[gpt-series\|GPT Family]]` | `gpt-series.md` |
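Those rules amount to: drop any alias and folder prefix, lowercase, and replace spaces and underscores with hyphens. A sketch of that normalization (the function name is mine, for illustration):

```python
def normalize_wikilink(target: str) -> str:
    """Resolve a [[wikilink]] target to its canonical page filename."""
    target = target.split("|")[0]   # drop alias: page|Alias -> page
    target = target.split("/")[-1]  # drop folder: concepts/page -> page
    slug = target.strip().lower().replace("_", "-").replace(" ", "-")
    return f"{slug}.md"

# Every row in the table above resolves identically:
# normalize_wikilink("GPT Series")            -> "gpt-series.md"
# normalize_wikilink("concepts/gpt-series")   -> "gpt-series.md"
# normalize_wikilink("gpt-series|GPT Family") -> "gpt-series.md"
```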
Instant local search -- no LLM, no API calls:
```bash
python search.py wiki/ "attention mechanism"   # CLI search
python search.py wiki/ --json "transformer"    # JSON output
python search.py wiki/ --serve                 # Web UI
python search.py wiki/ --stats                 # Index stats
```

Web UI at localhost:5000 -- dark theme, score badges, click-through to pages.
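BM25 is just term-frequency scoring with saturation and document-length normalization, which is why it runs instantly with no index server or embeddings. A minimal scoring sketch (k1=1.5 and b=0.75 are common defaults; `search.py`'s actual implementation may differ):

```python
import math
import re
from collections import Counter

def bm25_scores(query: str, docs: list[str], k1: float = 1.5, b: float = 0.75) -> list[float]:
    """Score each document against the query with classic BM25."""
    tokenize = lambda s: re.findall(r"\w+", s.lower())
    toks = [tokenize(d) for d in docs]
    avgdl = sum(len(t) for t in toks) / len(toks)  # average doc length
    n = len(docs)
    scores = []
    for t in toks:
        tf = Counter(t)
        score = 0.0
        for term in tokenize(query):
            df = sum(1 for d in toks if term in d)  # document frequency
            if df == 0:
                continue
            idf = math.log(1 + (n - df + 0.5) / (df + 0.5))
            f = tf[term]
            # Saturating term frequency, normalized by doc length
            score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(t) / avgdl))
        scores.append(score)
    return scores
```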
## Why Not RAG?

|  | RAG | Vaultweaver |
|---|---|---|
| Structure | Flat chunks in a vector DB | Interconnected [[wiki-links]] graph |
| Cross-refs | None -- chunks are isolated | Every page links to related pages |
| Contradictions | Hidden (chunks conflict silently) | Explicitly noted in articles |
| Compounding | Each query starts from scratch | Answers filed back improve future queries |
| Browsability | Need a special UI | Open in Obsidian or any editor |
| Infrastructure | Vector DB + embeddings | Plain markdown files |
| Best for | 1K-1M documents | 10-500 sources (personal research) |
```
my-project/
|
|-- raw/                   You own this (immutable)
|   |-- paper-on-transformers.md
|   |-- blog-post-on-bert.md
|   +-- architecture-diagram.png
|
|-- wiki/                  Claude owns this
|   |-- index.md           Categorized page catalog
|   |-- log.md             Parseable activity log
|   |-- overview.md        High-level synthesis
|   |-- schema.md          Wiki conventions
|   |-- sources/           One summary per raw file
|   |-- concepts/          Encyclopedic articles
|   |-- entities/          People, orgs, tools
|   +-- queries/           Filed Q&A
|
+-- search.py              BM25 search engine
```
Open wiki/ in Obsidian for:
- **Graph View** -- visual knowledge map showing all concepts and their connections
- **Backlinks** -- every page shows what links to it
- **Live Reload** -- watch the wiki grow as Claude writes pages
- **Dataview** -- query YAML frontmatter across all pages
Every page has structured frontmatter:

```yaml
---
title: "Self-Attention"
type: concept
tags: [attention, mechanism, transformer]
created: 2026-04-07
sources: [raw/transformers.md, raw/bert-paper.md]
---
```

| Element | Convention |
|---|---|
| Wikilinks | [[hyphenated-lowercase]] -- link liberally |
| First paragraph | 1-2 sentence summary of the page |
| Citations | According to [[sources/paper-name]]... |
| Contradictions | Always noted explicitly, never silently resolved |
| Sources section | Every concept/entity page ends with ## Sources |
| Log entries | `## [YYYY-MM-DD] operation \| title` -- parseable with grep |
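Because every log entry follows that one shape, `grep '^## \[' wiki/log.md` lists all operations, and the same line is easy to split programmatically. A sketch (the exact field layout beyond date, operation, and title is an assumption):

```python
import re

# ## [YYYY-MM-DD] operation | title
LOG_ENTRY = re.compile(r"^## \[(\d{4}-\d{2}-\d{2})\] (\S+) \| (.+)$")

def parse_log_line(line: str):
    """Split a wiki/log.md heading into (date, operation, title), or None."""
    m = LOG_ENTRY.match(line)
    return m.groups() if m else None

# parse_log_line("## [2026-04-07] ingest | transformers-paper")
# -> ("2026-04-07", "ingest", "transformers-paper")
```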
```
vaultweaver/
|-- SKILL.md               The brain: triggers, decision tree, operations
|-- hooks.json             Auto-pilot: Stop + SessionStart hooks
|-- settings.json          Registry metadata: name, version, tags
|-- scripts/
|   |-- check-wiki-drift.sh    Detects unprocessed raw/ files
|   +-- search.py              BM25 engine + Flask web UI
|-- references/
|   +-- operations.md          Detailed 10-step workflows
|-- assets/
|   +-- banner.svg             README header graphic
|-- install.sh             One-command install
|-- uninstall.sh           Clean removal
|-- LICENSE                MIT
+-- README.md              You are here
```
```bash
bash uninstall.sh
```

Removes the skill. Your wiki data is never touched.
Built by Apexlify

- Claude Code -- the AI backbone powering wiki compilation
- Obsidian -- the perfect viewer for interconnected markdown
- The open-source community that set the standard
If Vaultweaver is useful, give it a star.
Built with Claude Code. Viewed in Obsidian. Knowledge that compounds.