Introducing qubicDB v1.0

Brain-like
LLM Recursive
Database

Neurons fire and decay. Synapses form through Hebbian learning. Memories consolidate during sleep. A database that actually thinks like your brain.

terminal
//Neuron fires when accessed

neuron.Fire()

✓ Energy: 0.7 → 1.0

✓ LastFiredAt: now

//Hebbian learning kicks in

hebbian.OnNeuronFired(neuronID)

✓ Co-activation detected: 3 neurons

✓ Synapse n-42:n-17 strengthened → 0.84

✓ New synapse formed: n-42:n-89

⚡ Fire together, wire together

Built on proven open-source foundations

Go: Core runtime
MiniLM-L6-v2: GGUF embedding model (384-dim)
llama.cpp: GGUF inference via purego
SIMD: arm64/amd64 cosine similarity
VADER: Sentiment analysis (govader)
msgpack: Binary serialization (.nrdb)
MCP: Streamable HTTP tool protocol
MIT License: Fully open source

Architecture

A database that thinks
like your brain

Not another key-value store. QubicDB implements actual neuroscience principles — neurons, synapses, Hebbian learning, and organic memory lifecycle.

Neurons — Living Memory Units

Each memory is a neuron with energy, activation level, and depth. Neurons fire when accessed, decay over time, and consolidate into long-term storage — just like biological memory.

View docs

Synapses — Hebbian Learning

"Neurons that fire together, wire together." Synapses form automatically between co-activated neurons, strengthen with use, and weaken without it. Pure associative memory.

View docs

Matrix — N-Dimensional Space

Neurons exist in an organic, growing matrix. Related neurons pull closer together over time. The matrix reorganizes itself based on actual usage patterns.

View docs
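The "pull closer" step can be sketched in a few lines of Go. This is a hypothetical minimal 2-D version; the Matrix type, Attract method, and rate parameter are illustrative assumptions, not QubicDB's actual API:

```go
package main

import (
	"fmt"
	"math"
)

type NeuronID string

// Matrix is a hypothetical stand-in holding each neuron's spatial position.
type Matrix struct {
	Positions map[NeuronID][]float64
}

// Attract pulls two co-activated neurons a fraction of the way toward
// their midpoint, so related memories drift closer over time.
func (m *Matrix) Attract(a, b NeuronID, rate float64) {
	pa, pb := m.Positions[a], m.Positions[b]
	for i := range pa {
		mid := (pa[i] + pb[i]) / 2
		pa[i] += rate * (mid - pa[i])
		pb[i] += rate * (mid - pb[i])
	}
}

func dist(a, b []float64) float64 {
	var s float64
	for i := range a {
		d := a[i] - b[i]
		s += d * d
	}
	return math.Sqrt(s)
}

func main() {
	m := &Matrix{Positions: map[NeuronID][]float64{
		"n-42": {0, 0},
		"n-17": {10, 0},
	}}
	before := dist(m.Positions["n-42"], m.Positions["n-17"])
	m.Attract("n-42", "n-17", 0.1)
	after := dist(m.Positions["n-42"], m.Positions["n-17"])
	fmt.Printf("distance: %.1f → %.1f\n", before, after)
}
```

Applying Attract repeatedly for every co-activated pair is what lets spatial locality come to mirror usage patterns.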

Spreading Activation

Query one neuron and activation spreads through synaptic connections. Recursive traversal finds related memories you didn't explicitly search for — true associative recall.

View docs

Organic Lifecycle

Decay reduces unused memory energy. Consolidation moves important memories deeper. Pruning removes dead connections. The brain maintains itself automatically.

View docs

Brain States

Each brain transitions through Active → Idle → Sleeping → Dormant states. Consolidation happens during "sleep" — mirroring how biological memory actually works.

View docs
Neurons

Memories that live and breathe

Each memory is a neuron with its own energy level, activation state, and position in N-dimensional space. Neurons fire when accessed, boosting their energy. Over time, unused neurons decay — just like biological memory.

Energy-based activation
Time-based decay
Depth consolidation
N-dimensional positioning
neuron.go
// Neuron — the fundamental memory unit
type Neuron struct {
    ID          NeuronID
    Content     string
    
    // Spatial position in N-dimensional matrix
    Position    []float64
    
    // Activation dynamics
    Energy      float64   // Current activation (0.0 - 1.0)
    BaseEnergy  float64   // Resting energy level
    
    // Depth in memory hierarchy
    // 0 = surface/hot, higher = deeper/consolidated
    Depth       int
    
    // Temporal markers
    CreatedAt   time.Time
    LastFiredAt time.Time
    LastDecayAt time.Time
    AccessCount uint64
}

// Fire activates the neuron, boosting its energy
func (n *Neuron) Fire() {
    n.Energy = min(1.0, n.Energy + 0.3)
    n.LastFiredAt = time.Now()
    n.AccessCount++
}

// Decay reduces energy based on time since the last decay tick
// (LastDecayAt, not LastFiredAt, so repeated ticks never double-count)
func (n *Neuron) Decay(rate float64) {
    elapsed := time.Since(n.LastDecayAt).Seconds()
    n.Energy = max(n.BaseEnergy, n.Energy - rate*elapsed/3600)
    n.LastDecayAt = time.Now()
}
Synapses

Fire together, wire together

Hebbian learning creates and strengthens connections automatically. When neurons fire within the same time window, a synapse forms between them. The more they co-activate, the stronger the connection becomes.

Automatic synapse formation
Co-activation detection
Weight strengthening/weakening
Spatial clustering
hebbian.go
// HebbianEngine — "Neurons that fire together, wire together"
type HebbianEngine struct {
    matrix             *Matrix
    recentFires        map[NeuronID]time.Time
    coActivationWindow time.Duration  // 5 seconds
    learningRate       float64
    forgettingRate     float64
}

// OnNeuronFired — called whenever a neuron fires
func (h *HebbianEngine) OnNeuronFired(neuronID NeuronID) {
    now := time.Now()
    
    // Find co-activated neurons (fired within window)
    for id, firedAt := range h.recentFires {
        if now.Sub(firedAt) <= h.coActivationWindow {
            // Strengthen or create synapse
            h.strengthenOrCreate(neuronID, id)
        }
    }
    
    h.recentFires[neuronID] = now
}

// Synapses strengthen with use, weaken without it
func (s *Synapse) Strengthen(delta float64) {
    s.Weight = min(1.0, s.Weight + delta)
    s.CoFireCount++
}
Spreading Activation

Recursive associative recall

Query one memory and activation spreads through synaptic connections. The search traverses the neural graph recursively, finding related memories you didn't explicitly search for — true associative intelligence.

Recursive graph traversal
Energy propagation
Synaptic weight influence
Automatic Hebbian reinforcement
search.go
// Search with spreading activation
func (e *Engine) Search(query string, depth int, limit int) []*Neuron {
    // Start with lexical/vector matches
    seeds := e.findSeeds(query)
    
    // Spread activation through synaptic connections
    activated := make(map[NeuronID]float64)
    for _, seed := range seeds {
        e.spreadActivation(seed.ID, seed.Energy, depth, activated)
    }
    
    // Collect and rank by accumulated activation
    results := e.collectByActivation(activated, limit)
    
    // Fire all accessed neurons (Hebbian reinforcement)
    for _, n := range results {
        n.Fire()
        e.hebbian.OnNeuronFired(n.ID)
    }
    
    return results
}

// Recursive traversal through neural graph
func (e *Engine) spreadActivation(
    id NeuronID, 
    energy float64, 
    depth int, 
    acc map[NeuronID]float64,
) {
    if depth <= 0 || energy < 0.01 { return }
    
    acc[id] += energy
    
    // Spread to connected neurons via synapses
    for _, neighborID := range e.matrix.Adjacency[id] {
        synapse := e.getSynapse(id, neighborID)
        e.spreadActivation(neighborID, energy*synapse.Weight*0.7, depth-1, acc)
    }
}
Organic Lifecycle

The brain maintains itself

Background daemons run continuously: Decay reduces unused neuron energy. Consolidation moves important memories deeper. Pruning removes dead synapses. The brain reorganizes itself — no manual maintenance required.

Active → Idle → Sleeping → Dormant
Sleep consolidation
Automatic decay & pruning
Self-reorganizing matrix
lifecycle.go
// Brain states — like biological sleep cycles
type BrainState int
const (
    StateActive   BrainState = iota  // Recently used
    StateIdle                         // No recent activity
    StateSleeping                     // Consolidation happening
    StateDormant                      // Ready for eviction
)

// Consolidation happens during "sleep"
func (dm *DaemonManager) consolidateDaemon() {
    for dm.waitInterval(dm.consolidateInterval) {
        // Only consolidate sleeping brains
        sleeping := dm.lifecycle.GetSleepingUsers()
        for _, indexID := range sleeping {
            worker := dm.pool.Get(indexID)
            worker.Submit(&Operation{Type: OpConsolidate})
            // 🌙 Index abc: consolidated 47 neurons
        }
    }
}

// Decay daemon reduces energy of unused memories
// Prune daemon removes dead synapses
// Reorg daemon optimizes spatial locality
// Persist daemon flushes to durable storage

Neural Dashboard

Watch your brain
think in real-time

Monitor neuron energy levels, observe synapse formation, track consolidation during sleep cycles, and watch the matrix reorganize itself.

QubicDB Console — Brain Inspector
Neuroscience-Grounded Architecture

Not a vector store.
A living brain.

QubicDB is built on eight interlocking mechanisms derived from computational neuroscience. Each layer compounds the others — producing recall accuracy and contextual depth that flat vector databases cannot replicate.

Engram Neurons

Biologically inspired memory units

Each memory is a living neuron — not a static row. Neurons carry energy, activation level, N-dimensional spatial position, lifecycle state, and synaptic connections. They fire when accessed, strengthening associated pathways, and decay when unused — exactly like biological engrams.

Energy [0,1] · Activation level · Spatial position · Lifecycle state · Decay over time

Hebbian Learning

Neurons that fire together, wire together

Synaptic connections form and strengthen automatically based on co-activation. When two neurons are accessed together, their synapse weight increases. Without reinforcement, weights decay. This creates genuine associative memory — not keyword indexes.

Auto-formed synapses · Weight strengthening · Weight decay · Co-activation tracking

Spreading Activation

Recursive associative recall

A query activates seed neurons, then activation spreads recursively through the synaptic graph — depth-first, weighted by synapse strength and neuron energy. Related memories surface automatically without explicit joins or graph queries. A configurable depth controls recall breadth.

Graph traversal · Configurable depth · Synapse-weighted · Energy-gated

Fractal Clustering

Self-similar spatial topology

Neurons organize into fractal clusters in N-dimensional space. Three-phase algorithm: pairwise attraction toward midpoints, centroid pull toward cluster center, repulsion of unconnected neurons. Dense cores → loose shells → separated clusters. The topology mirrors actual usage patterns.

Pairwise attraction · Centroid pull · Repulsion force · Self-organizing · Usage-driven
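The centroid-pull and repulsion phases can be sketched in one dimension (pairwise attraction toward midpoints works analogously); the node type and force constants below are illustrative assumptions, not QubicDB's actual clustering code:

```go
package main

import "fmt"

// node is a hypothetical stand-in for a neuron's spatial state.
type node struct {
	pos       float64
	connected bool // member of the cluster being relaxed
}

// step applies one relaxation pass: connected nodes are pulled toward
// the cluster centroid; unconnected nodes are pushed away from it.
func step(nodes []*node, pull, repel float64) {
	// Centroid of the connected members.
	var sum float64
	var n int
	for _, nd := range nodes {
		if nd.connected {
			sum += nd.pos
			n++
		}
	}
	centroid := sum / float64(n)

	for _, nd := range nodes {
		switch {
		case nd.connected:
			nd.pos += pull * (centroid - nd.pos) // centroid pull
		case nd.pos >= centroid:
			nd.pos += repel // repulsion: push away from the core
		default:
			nd.pos -= repel
		}
	}
}

func main() {
	a := &node{pos: 0, connected: true}
	b := &node{pos: 10, connected: true}
	c := &node{pos: 6} // unconnected neuron sitting near the cluster
	step([]*node{a, b, c}, 0.1, 0.5)
	fmt.Printf("a=%.1f b=%.1f c=%.1f\n", a.pos, b.pos, c.pos)
}
```

Iterating this pass is what produces the dense-core, loose-shell topology described above.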

Emotional Mapping

VADER sentiment + Ekman emotion model

Every memory is analyzed for emotional valence using VADER (govader). Six Ekman primary emotions are mapped: happiness, sadness, fear, anger, disgust, surprise. At search time, sentiment-matching memories receive a boost multiplier [0.8–1.2] — emotionally congruent recall, like human memory.

VADER analysis · 6 Ekman emotions · Sentiment boost ×1.2 · Query-neuron matching · Compound scoring
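One way to map sentiment congruence into the [0.8, 1.2] boost range, assuming compound scores in [-1, 1] as VADER produces. The exact mapping below is an assumption for illustration, not QubicDB's actual formula:

```go
package main

import "fmt"

// sentimentBoost compares the query's and the neuron's compound
// sentiment scores and returns a multiplier in [0.8, 1.2]:
// agreement boosts, opposition penalizes.
func sentimentBoost(querySent, neuronSent float64) float64 {
	// Congruence in [-1, 1]: +1 when sentiments agree exactly,
	// -1 when they sit at opposite extremes.
	congruence := 1 - abs(querySent-neuronSent)
	// Map linearly into the boost range [0.8, 1.2].
	return 1.0 + 0.2*congruence
}

func abs(x float64) float64 {
	if x < 0 {
		return -x
	}
	return x
}

func main() {
	fmt.Printf("matched: %.2f\n", sentimentBoost(0.8, 0.8))
	fmt.Printf("opposed: %.2f\n", sentimentBoost(0.8, -0.8))
}
```

Whatever the exact curve, the key property is that emotionally congruent memories score up to 20% higher at search time.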

Organic Lifecycle

Active → Idle → Sleeping → Dormant

Background daemons continuously manage neuron health: Decay reduces energy over time (using LastDecayAt, not LastFiredAt), Consolidation moves low-energy neurons to deeper storage, Pruning removes dead synapses, Reorganization rebalances spatial positions. No manual garbage collection.

Decay daemon · Consolidation · Pruning · Reorganization · LastDecayAt tracking

Hybrid Search Scoring

α·vector + (1−α)·tanh(lexical/10) × sentiment

Search combines three signals: vector cosine similarity via MiniLM-L6-v2 (384-dim, GGUF), lexical string matching normalized with tanh(x/10) for better score separation, and sentiment congruence boost. Alpha (default 0.6) controls vector vs lexical balance. Short queries are expanded and repeated (Springer et al. 2024).

MiniLM-L6-v2 · α=0.6 default · tanh(x/10) norm · Query expansion · Sentiment boost

Concurrency Model

Per-index serialized worker goroutines

Each index (user/agent) gets a dedicated BrainWorker goroutine with a serialized operation queue. No cross-index locking. Reads use RWMutex for parallelism; writes serialize through the worker. Background daemons run independently. Deadlock-free by design — repulsion reads under RLock, releases, then writes.

Per-index workers · Serialized op queue · RWMutex reads · Deadlock-free · Independent daemons
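The per-index worker pattern reduces to a goroutine draining a serialized channel. A minimal sketch; the Operation and BrainWorker shapes below are illustrative assumptions, not QubicDB's actual types:

```go
package main

import (
	"fmt"
	"sync"
)

// Operation is a queued write; Done signals completion to the caller.
type Operation struct {
	Type string
	Done chan struct{}
}

// BrainWorker owns one index's state: only its goroutine mutates it,
// so no cross-index locking is ever needed.
type BrainWorker struct {
	ops chan *Operation
}

func NewBrainWorker() *BrainWorker {
	w := &BrainWorker{ops: make(chan *Operation, 64)}
	go w.run() // one dedicated goroutine per index
	return w
}

func (w *BrainWorker) run() {
	for op := range w.ops {
		// Apply the operation against this index's brain state here.
		// Writes serialize naturally: one goroutine, one queue.
		close(op.Done)
	}
}

// Submit only enqueues, which is why async submit is near-zero overhead.
func (w *BrainWorker) Submit(op *Operation) { w.ops <- op }

func main() {
	w := NewBrainWorker()
	var wg sync.WaitGroup
	for i := 0; i < 3; i++ {
		op := &Operation{Type: "AddNeuron", Done: make(chan struct{})}
		wg.Add(1)
		go func() { defer wg.Done(); w.Submit(op); <-op.Done }()
	}
	wg.Wait()
	fmt.Println("3 ops applied by a single worker goroutine")
}
```

Reads would bypass the queue under an RWMutex, as the card above describes.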

Why Accuracy Matters Here

Design decisions that directly improve recall quality

Recall precision

Spreading activation surfaces semantically related memories that pure vector search misses — associative paths through synaptic graph.

Temporal accuracy

Decay uses actual elapsed time since last decay tick (LastDecayAt), not last access — preventing stale memories from persisting.

Consolidation guard

Energy < 0.5 criterion prevents active neurons from being consolidated — only truly dormant memories move to deep storage.

Score separation

tanh(x/10) normalization gives 0.031→0.143 spread across [0,15] range vs the old x/5 formula — better ranking differentiation.

Emotional congruence

Sentiment-matched memories score up to 20% higher — mirroring how humans recall emotionally congruent information more readily.

Vector alignment

Write and search paths both run CleanText before embedding — ensuring consistent tokenization and preventing embedding drift.
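The shared-normalization idea: one function runs before embedding on both paths, so identical content always embeds identically. The normalization steps below are an illustrative assumption, not QubicDB's actual CleanText:

```go
package main

import (
	"fmt"
	"strings"
)

// cleanText normalizes input the same way on write and on search:
// trim, lowercase, and collapse all internal whitespace runs.
func cleanText(s string) string {
	s = strings.ToLower(strings.TrimSpace(s))
	return strings.Join(strings.Fields(s), " ")
}

func main() {
	stored := cleanText("  Neurons  FIRE together\n") // write path
	queried := cleanText("neurons fire together")     // search path
	fmt.Println(stored == queried)                    // prints "true"
}
```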

Hybrid Scoring Formula

score = α × cosine(query_vec, neuron_vec) + (1−α) × tanh(lexical_score / 10) × sentiment_boost
// α=0.6 default · sentiment_boost ∈ [0.8, 1.2] · spreading activation applied post-score
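The formula above, written out as a Go function. The signature is illustrative; QubicDB's actual scoring code may differ:

```go
package main

import (
	"fmt"
	"math"
)

// hybridScore combines vector similarity, tanh-normalized lexical
// match, and the sentiment congruence multiplier:
// score = α·cosine + (1−α)·tanh(lexical/10)·sentimentBoost
func hybridScore(cosine, lexical, sentimentBoost, alpha float64) float64 {
	return alpha*cosine + (1-alpha)*math.Tanh(lexical/10)*sentimentBoost
}

func main() {
	// α=0.6 default; sentiment boost ∈ [0.8, 1.2].
	s := hybridScore(0.75, 8.0, 1.2, 0.6)
	fmt.Printf("score: %.3f\n", s)
}
```

Raising α weights the 384-dim MiniLM embedding more heavily; lowering it favors lexical matching.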

Benchmarks

Real numbers,
real hardware

Apple M3 · darwin/arm64 · go test -benchmem

github.com/qubicDB/benchmarks
engine

AddNeuron

Write + embed + Hebbian update

704,893 ns/op
engine

Search (1K neurons)

Hybrid lexical + vector + spread activation

2,277,576 ns/op
engine

ParallelSearch

Read-lock concurrent search (~5.5× faster)

414,734 ns/op
concurrency

BrainWorker AddNeuron

Full worker queue round-trip

1,394,905 ns/op
concurrency

BrainWorker Search

Worker queue + engine search

1,759,222 ns/op
concurrency

Parallel Async Submit

Queue submit only — near-zero overhead

80 ns/op
Reproduce locally
# Engine microbenchmarks (no model needed)
go test -run '^$' -bench 'BenchmarkMatrixEngine' \
  -benchmem ./pkg/engine

# Worker/concurrency path
go test -run '^$' -bench 'BenchmarkBrainWorker' \
  -benchmem ./pkg/concurrency

# Live vector benchmarks (requires GGUF model)
QUBICDB_VECTOR_MODEL_PATH=./dist/MiniLM-L6-v2.Q8_0.gguf \
  go test -run '^$' -bench . -benchmem ./pkg/e2e
Open Source · MIT License

Everything is open.
Self-host anywhere.

REST API, MCP endpoint, persistence layer, lifecycle daemons, admin UI, CLI, and all SDKs — fully open source under MIT license. No cloud lock-in.

Quickstart
terminal
# Pull and run
docker pull qubicdb/qubicdb:1.0.0
docker run -d -p 6060:6060 \
  -e QUBICDB_ADMIN_USER=admin \
  -e QUBICDB_ADMIN_PASSWORD=changeme \
  qubicdb/qubicdb:1.0.0

# Admin UI
docker pull qubicdb/qubicdb-ui:1.0.0
docker run -d -p 8080:80 qubicdb/qubicdb-ui:1.0.0

# Or build from source
git clone https://github.com/qubicDB/qubicdb
cd qubicdb && go build ./cmd/qubicdb/
View on GitHub

Give your AI
a real brain

Not just storage — actual neural architecture. Neurons fire and decay, synapses form through learning, memories consolidate during sleep.

Open source · MIT License · Built with neuroscience principles