JHdehao/gemini-mem-v3

# Gemini-Mem V3: The Self-Evolving Cognitive Engine

> "Experience is the architect of the mind."

Gemini-Mem V3 is not a search index; it is a self-evolving cognitive substrate for the Gemini CLI. Inspired by the Evolver architecture and high-frequency agent workflows, V3 bridges the gap between ephemeral interactions and permanent engineering wisdom.


## 🧬 The "Evolver" Feedback Loop

V3 implements a four-stage evolutionary cycle that operates silently in the background:

1. **Ingestion**: Capture raw dialogue via zero-latency asynchronous hooks.
2. **Extraction**: Gemini 3 series models distill the "What" (Facts), the "Who" (Entities), and the "How" (SOPs).
3. **Recognition**: The engine detects recurring technical patterns (Concepts) across multiple sessions.
4. **Distillation**: Upon reaching a "Success Threshold," raw memories are crystallized into formal `.md` skills.
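The four-stage cycle can be sketched in miniature. Everything below is illustrative: the function names, record shapes, and inline success threshold are assumptions, and the real LLM extraction calls are elided.

```python
# Illustrative sketch of the Evolver cycle; names and record shapes
# are assumptions, and the actual LLM calls are elided.

def ingest(dialogue: str) -> dict:
    """Stage 1: capture raw dialogue from the async hook."""
    return {"raw": dialogue}

def extract(task: dict) -> dict:
    """Stage 2: distill Facts, Entities, and SOP steps (LLM call elided)."""
    task.update({"facts": [], "entities": [], "concepts": [], "sop": []})
    return task

def recognize(task: dict, concept_counts: dict) -> dict:
    """Stage 3: tally recurring technical concepts across sessions."""
    for concept in task.get("concepts", []):
        concept_counts[concept] = concept_counts.get(concept, 0) + 1
    return concept_counts

def distill(concept_counts: dict, success_threshold: int = 2) -> list:
    """Stage 4: crystallize concepts that cross the success threshold."""
    return [c for c, n in concept_counts.items() if n >= success_threshold]
```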

## ⚡ Hardcore Architectural Features

### 1. Procedural Memory: The SOP Engine

Traditional memory systems fail because they recall facts when you need processes. V3 extracts Standard Operating Procedures (SOPs) from your successes.

- **Micro-Workflow Capture**: Automatically identifies the exact shell commands, file patches, and config tweaks that solved a bug.
- **Example Output**:

  ```json
  "sop": [
    "1. Bypass build isolation using --no-build-isolation",
    "2. Patch setup.py to hardcode sm_86",
    "3. Rebuild native modules"
  ]
  ```

### 2. Knowledge Solidification (SkillDistiller)

The SkillDistiller is the heart of the engine's "Self-Learning" capability.

- **Pattern Matching**: Monitors the `concepts_json` field in the SQLite backend.
- **The 2-Hit Rule**: Once a complex technical concept (e.g., "Docker Optimization") is successfully validated twice, the Distiller triggers a synthesis event.
- **Auto-Skill Creation**: Generates a professional-grade `SKILL.md` in `.gemini/skills/`, creating a project-specific library of "Best Practices."
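A minimal sketch of the 2-Hit Rule check: only the `concepts_json` field name comes from this README; the `memories` table name and surrounding schema are assumptions.

```python
import json
import sqlite3

def due_for_distillation(db: sqlite3.Connection, threshold: int = 2) -> list[str]:
    """Return concepts whose occurrence count has reached the threshold.
    Only concepts_json is named in the README; the table name is assumed."""
    counts: dict[str, int] = {}
    for (concepts_json,) in db.execute("SELECT concepts_json FROM memories"):
        for concept in json.loads(concepts_json):
            counts[concept] = counts.get(concept, 0) + 1
    # Concepts validated `threshold` times trigger a synthesis event.
    return [c for c, n in counts.items() if n >= threshold]
```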

### 3. Asynchronous Nervous System

To prevent AI "brain fog" (UI lag), V3 decouples cognition from conversation.

- **Task Offloading**: The `AfterAgent` hook merely appends to a `pending_tasks` queue (WAL-mode SQLite).
- **Background Worker**: A dedicated process handles the heavy LLM lifting, ensuring your CLI remains responsive regardless of memory complexity.
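The hook/worker split can be sketched as a tiny WAL-mode queue. The `pending_tasks` table name is from this README; the column layout and function names are assumptions.

```python
import sqlite3

def open_queue(db_path: str) -> sqlite3.Connection:
    """Open the task queue in WAL mode, which allows one writer plus
    concurrent readers, so the hook never blocks behind the worker."""
    db = sqlite3.connect(db_path)
    db.execute("PRAGMA journal_mode=WAL")
    db.execute(
        "CREATE TABLE IF NOT EXISTS pending_tasks"
        " (id INTEGER PRIMARY KEY, payload TEXT)"
    )
    return db

def append_pending_task(db: sqlite3.Connection, payload: str) -> None:
    """Hook side: one cheap INSERT; the heavy LLM work happens later."""
    db.execute("INSERT INTO pending_tasks (payload) VALUES (?)", (payload,))
    db.commit()

def pop_pending_task(db: sqlite3.Connection):
    """Worker side: take the oldest task off the queue, or None if empty."""
    row = db.execute(
        "SELECT id, payload FROM pending_tasks ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    db.execute("DELETE FROM pending_tasks WHERE id = ?", (row[0],))
    db.commit()
    return row[1]
```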

### 4. Layered Semantic Retrieval

V3 uses a tiered approach to memory recall, ensuring that the most relevant "wisdom" is always injected:

- **Tier 1: Entities**: Matches specific symbols (functions, class names) currently in scope.
- **Tier 2: Concepts**: Injects architectural patterns relevant to the current task.
- **Tier 3: SOPs**: Provides actionable steps for the current problem based on historical wins.
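The tiered ranking above can be sketched as a simple priority sort; the memory record shape and the injection budget are assumptions, not the project's actual schema.

```python
def layered_recall(
    scope_entities: set[str],
    task_concepts: set[str],
    memories: list[dict],
    budget: int = 3,
) -> list[dict]:
    """Tiered recall sketch (record shape is assumed):
    Tier 1: memories naming a symbol currently in scope.
    Tier 2: memories sharing a concept with the current task.
    Tier 3: memories that carry an SOP at all."""
    def tier(mem: dict) -> int:
        if scope_entities & set(mem.get("entities", ())):
            return 1
        if task_concepts & set(mem.get("concepts", ())):
            return 2
        return 3 if mem.get("sop") else 4

    # Drop memories that match nothing, then inject the best few.
    ranked = [m for m in memories if tier(m) < 4]
    return sorted(ranked, key=tier)[:budget]
```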

## 🏗️ Technical Specifications

- **Kernel**: Optimized for the Gemini 3 series (leveraging high-token reasoning).
- **Storage Layer**:
  - **SQLite FTS5**: High-speed full-text indexing for titles, summaries, and SOPs.
  - **JSON-Relation Mapping**: Hierarchical storage of concepts and entities.
- **Environment**: Native Node.js / Bun compatibility with a minimal dependency footprint.
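The FTS5 layer can be demonstrated in a few lines. The indexed fields (title, summary, SOP) come from this README; the virtual-table name and exact schema are assumptions.

```python
import sqlite3

# Sketch of the FTS5 storage layer; table name and schema are assumed,
# only the indexed fields are named in the README.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE memories_fts USING fts5(title, summary, sop)")
db.execute(
    "INSERT INTO memories_fts VALUES (?, ?, ?)",
    (
        "CUDA build fix",
        "Patched setup.py to hardcode sm_86",
        "1. Bypass build isolation 2. Rebuild native modules",
    ),
)
# FTS5 MATCH searches every indexed column at once, case-insensitively.
rows = db.execute(
    "SELECT title FROM memories_fts WHERE memories_fts MATCH ?", ("patched",)
).fetchall()
# rows -> [('CUDA build fix',)]
```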

## 🚀 Deployment & Evolution

### Installation

```shell
gemini extension install .
```

### Self-Learning in Action

1. **Work**: Solve a complex problem with the Gemini CLI.
2. **Evolve**: The V3 background worker extracts the SOP and stores the concept.
3. **Crystallize**: Repeat a similar task. The SkillDistiller automatically creates a Skill file.
4. **Recall**: Next time you start a session, the relevant Skill is automatically injected into the context.

## 🛡️ Engineering Standards

- **Local-First**: Zero telemetry for your data. Your wisdom stays in your project.
- **Anti-Leakage**: Advanced regex redaction to ensure credentials never enter the long-term memory pool.
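As a sketch of the redaction pass, a few credential-shaped patterns can be scrubbed before anything is persisted. These specific patterns are illustrative assumptions; the project's actual rule set is not shown here.

```python
import re

# Illustrative patterns only; the project's real redaction rules are not shown.
SECRET_PATTERNS = [
    re.compile(r"(?i)\b(api[_-]?key|token|secret|password)\b\s*[:=]\s*\S+"),
    re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"),   # OpenAI-style secret keys
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),      # AWS access key IDs
]

def redact(text: str) -> str:
    """Scrub credential-shaped strings before they reach long-term memory."""
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```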

## 📜 License

GPL-3.0
Built to protect open-source innovation and prevent architectural plagiarism.


Designed for engineers who build things that last. Driven by an engine that never stops learning.
