Distributed Library
Knowledge flows between trusted peers, enriching every participant without central coordination.
Open Source · Apache 2.0 · Federated · Sovereign
From the Latin liber, meaning both "free" and "book": free minds sharing free knowledge. A federation hub that enables independent MoE Sovereign AI instances to exchange knowledge securely, without sacrificing sovereignty or control.
The Concept
Every MoE Sovereign node builds its own local knowledge base through daily interaction. MoE Libris lets these nodes voluntarily share curated knowledge entries with other instances across the network, creating a distributed library of collective intelligence.
Each node retains complete control over what it publishes and what it accepts. Your instance, your rules.
No forced synchronization. Federation requires explicit agreement from both nodes.
Inspired by the Fediverse — decentralized networks that thrive without corporate control.
Capabilities
Built for trust, transparency, and autonomy.
Every knowledge entry passes through a multi-stage validation pipeline before it can be shared or ingested. No unreviewed content enters your system.
Incoming knowledge entries land in a review queue where administrators can inspect, approve, or reject each item before integration.
Rate limiting, content validation, and reputation scoring protect the network from spam, poisoning attacks, and malicious actors.
Federation requires mutual agreement. Both nodes must explicitly approve the connection before any data flows between them.
Server discovery happens through a public Git registry. No central directory server — just a transparent, version-controlled list of participating nodes.
Each node decides what to share, who to federate with, and what to accept. Your instance, your rules. No forced assimilation.
System Design
How MoE Libris connects sovereign nodes.
Security model: Bundles are transmitted over mTLS (encrypted in transit). There is no end-to-end encryption — the Pre-Audit pipeline reads the JSON-LD payload, and an Admin reviews entries in the Audit Queue before graph merge. Privacy scrubbing (PII, secrets, hostnames) runs at the sender node before the bundle leaves the originating instance.
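The sender-side scrubbing step can be illustrated with a minimal regex-based redactor. The patterns and placeholder labels here are assumptions for illustration only; a production scrubber would cover far more cases (hostnames, key material, file paths).

```python
import re

# Illustrative patterns only; not an exhaustive PII/secret catalogue.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "IPV4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "SECRET": re.compile(r"(?i)\b(api[_-]?key|token|password)\s*[:=]\s*\S+"),
}

def scrub(text: str) -> str:
    """Replace PII and secret-like strings with placeholders
    before a bundle leaves the originating node."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Because scrubbing runs before transmission, the receiving node and any intermediary only ever see the redacted payload.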
Nodes find each other through the public Git registry. Each entry contains the node's API endpoint and public key.
Two nodes establish a bilateral federation agreement. Both administrators must explicitly approve the connection.
Nodes push curated knowledge entries to their federation partners and pull new entries from them. All transfers are authenticated and encrypted.
Incoming entries pass through the pre-audit pipeline and land in the audit queue. Administrators approve, reject, or flag each item.
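The exchange-then-review flow above can be modeled with a minimal in-memory audit queue (all names hypothetical): entries pulled from a federation partner wait as pending, and only an explicit admin approval merges them into the local knowledge store.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Entry:
    source_node: str
    title: str
    status: str = "pending"  # pending -> approved | rejected

class AuditQueue:
    """Incoming federated entries wait here until an admin decides."""

    def __init__(self):
        self.pending = deque()
        self.graph = []  # stand-in for the local knowledge graph

    def receive(self, entry: Entry) -> None:
        # In the real system the pre-audit pipeline runs first;
        # entries that pass still require human review.
        self.pending.append(entry)

    def review(self, approve: bool) -> Entry:
        entry = self.pending.popleft()
        entry.status = "approved" if approve else "rejected"
        if approve:
            self.graph.append(entry)  # merge only after approval
        return entry
```

The key property this models: nothing reaches `graph` without passing through `review`, mirroring the page's "no unreviewed content" guarantee.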
Philosophy
Sovereignty through voluntary cooperation.
MoE Libris draws inspiration from the Fediverse — the family of federated social platforms like Friendica and Mastodon that proved decentralized networks can thrive without corporate control.
Participation is entirely voluntary. Each node maintains full sovereignty over its knowledge base, its federation partners, and its acceptance criteria. There is no central server, no mandatory synchronization, no algorithmic curation imposed from above.
"No forced assimilation — like Unimatrix Zero from Star Trek, sovereign minds choosing to connect while preserving their individuality."
The result is a network where knowledge flows freely between trusted peers, where each node strengthens the collective without surrendering autonomy, and where the value of the whole emerges from the sovereignty of each part.
Under the Hood
Built on proven, open-source foundations.
High-performance async Python framework for the federation API endpoints.
Reliable relational storage for federation state, audit logs, and node metadata.
Graph database for knowledge entries, enabling semantic relationships and traversal queries.
In-memory data store for caching, rate limiting, and real-time federation state.
Containerized deployment for reproducible, portable federation nodes.
Core implementation language, consistent with the MoE Sovereign ecosystem.
Docker and Docker Compose. 2 GB RAM minimum. No GPU required. Works on any x86_64 or ARM64 host.
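Assuming a PostgreSQL and Redis backing stack (matching the component descriptions above), a minimal Docker Compose layout might look like the following; service names, images, and ports are illustrative, not the project's actual configuration:

```yaml
# Hypothetical layout; actual service names and images may differ.
services:
  libris-api:
    build: .
    ports:
      - "8443:8443"       # federation API, served over mTLS
    depends_on:
      - db
      - cache
  db:
    image: postgres:16    # federation state, audit logs, node metadata
  cache:
    image: redis:7        # rate limiting and caching
```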
Ecosystem
MoE Libris is the federation layer of the MoE Sovereign AI ecosystem.
The main MoE Sovereign project — distributed AI inference on your own hardware.
Full documentation: architecture, API reference, integration guides, and administration manual.
German version of the MoE Sovereign project website.
Source code, issues, and contribution guidelines for the federation protocol.