Secure everything you build and run with agents on-chain.

Cut agent LLM costs 40–70% with root-pair telemetry, interaction graphs, and symmetry-aware context filtering. Verifiable on Celo.

Open Dashboard
Root structure and chambers

For you

Algebraic context filtering

Root vectors and chambers form a geometric view of how your agent behaves. We filter context by chamber, graph, and reflection — so you send exactly what’s relevant, not the whole thread.
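A minimal sketch of the chamber idea: a vector is assigned to a chamber by the sign pattern of its dot products with discovered root directions. The names `roots` and `chamberOf` here are illustrative, not the RootRouter API.

```typescript
type Vec = number[];

const dot = (a: Vec, b: Vec): number =>
  a.reduce((s, x, i) => s + x * b[i], 0);

// A chamber is identified by which side of each root hyperplane
// the vector lies on: one sign per root direction.
function chamberOf(v: Vec, roots: Vec[]): string {
  return roots.map((r) => (dot(v, r) >= 0 ? '+' : '-')).join('');
}

// Two example root directions in 2-D
const roots: Vec[] = [
  [1, 0],
  [0, 1],
];

console.log(chamberOf([0.8, -0.3], roots)); // '+-'
console.log(chamberOf([0.2, 0.9], roots));  // '++'
```

Vectors with the same sign pattern behave similarly, which is what makes chambers a useful unit for filtering.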

Learn more

Why RootRouter

Proactive, verifiable agent infrastructure.

Proactive context filtering

Send only relevant history to the LLM. Chambers and the interaction graph decide what matters — no more full-thread bloat.

Real-time, full-chain visibility

Telemetry on Celo. Every interaction is logged on-chain for verifiable, auditable agent infrastructure.

Build with confidence

Same or better intent–execution alignment. Easy tasks use cheap models; hard tasks use powerful ones. Route by geometry, not guesswork.

Root-pair telemetry and model routing

Every interaction produces a root vector (intent minus execution). We use it to discover chambers, filter context, and route easy tasks to fast models and hard tasks to powerful ones. For swarms, the topology graph routes to the right agent per chamber.
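The routing idea above can be sketched as follows: the root vector is the difference between the intent and execution embeddings, and a small norm suggests an easy task. The model names and the threshold are illustrative assumptions, not RootRouter defaults.

```typescript
type Vec = number[];

const sub = (a: Vec, b: Vec): Vec => a.map((x, i) => x - b[i]);
const norm = (v: Vec): number =>
  Math.sqrt(v.reduce((s, x) => s + x * x, 0));

// Route by the magnitude of the intent-execution gap:
// small gap -> cheap model, large gap -> powerful model.
function pickModel(intent: Vec, execution: Vec, threshold = 0.5): string {
  const root = sub(intent, execution);
  return norm(root) < threshold
    ? 'meta-llama/llama-3.1-8b-instruct' // fast, cheap (example slug)
    : 'anthropic/claude-sonnet-4';       // powerful (example slug)
}

console.log(pickModel([0.9, 0.1], [0.85, 0.12])); // small gap -> cheap model
```

Routing on a measured quantity rather than a heuristic is what "route by geometry, not guesswork" means in practice.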

View topology

Verifiable agent infrastructure on Celo

Telemetry is logged on-chain. Other agents can query the contract to assess performance before delegating. ERC-8004 compatible — discoverable as a Trustless Agent in the Celo ecosystem.
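As an illustration of the delegation check, here is one way a delegating agent might score another agent's telemetry before handing off a task. The `Telemetry` shape and the weighting are assumptions, not the actual contract schema.

```typescript
interface Telemetry {
  interactions: number; // total logged interactions
  meanRootNorm: number; // average intent-execution gap (lower is better)
  tokensSaved: number;  // cumulative tokens saved
}

// Combine alignment quality with track-record size into a single score.
function trustScore(t: Telemetry): number {
  if (t.interactions === 0) return 0;
  const alignment = 1 / (1 + t.meanRootNorm);    // 1 when the gap is 0
  const experience = Math.log1p(t.interactions); // diminishing returns
  return alignment * experience;
}

const a: Telemetry = { interactions: 200, meanRootNorm: 0.2, tokensSaved: 50_000 };
const b: Telemetry = { interactions: 200, meanRootNorm: 0.9, tokensSaved: 10_000 };
console.assert(trustScore(a) > trustScore(b)); // prefer the better-aligned agent
```

The point is that the inputs come from a public contract, so the score is computed over auditable data rather than self-reported claims.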

Open Dashboard

How do I get started?

Install the package, set .env with your LLM and Celo credentials, then wrap your chat calls with router.chat(). Run the demos (npx tsx demo/basic.ts or demo/benchmark.ts) offline, or open the Dashboard and load telemetry from an agent address on Celo.

get-started.ts
// Install: npm install rootrouter
// Set OPENROUTER_KEY, CELO_RPC_URL, etc. in .env
import { RootRouter } from 'rootrouter';

const router = new RootRouter({
  llmBaseUrl: 'https://openrouter.ai/api/v1',
  llmApiKey: process.env.OPENROUTER_KEY,
  celoRpcUrl: process.env.CELO_RPC_URL,
  celoPrivateKey: process.env.CELO_PRIVATE_KEY,
  telemetryContractAddress: process.env.TELEMETRY_CONTRACT_ADDRESS,
});

const result = await router.chat({
  agentId: 'my-agent',
  messages: [{ role: 'user', content: 'Write a sorting algorithm' }],
});

console.log(result.response);
console.log(`Saved ${result.telemetry.tokensSaved} tokens`);

Leading algebraic agent infrastructure.

Trusted by builders who care about cost and verification.

We send only relevant conversation history to the LLM instead of the full thread. Using root-pair telemetry we discover chambers (regions of similar behavior) and filter context by chamber, graph, and reflection — so you typically save 40–70% of tokens with no quality loss.
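A sketch of the filtering step, under assumptions: history turns are kept only if their embedding is cosine-similar to the current query's chamber centroid. The `Turn` shape, `minSim` threshold, and embeddings are illustrative, not the SDK API.

```typescript
type Vec = number[];
interface Turn { content: string; embedding: Vec; }

const dot = (a: Vec, b: Vec) => a.reduce((s, x, i) => s + x * b[i], 0);
const norm = (v: Vec) => Math.sqrt(dot(v, v));
const cosine = (a: Vec, b: Vec) => dot(a, b) / (norm(a) * norm(b));

// Keep only turns that lie near the active chamber's centroid.
function filterContext(history: Turn[], centroid: Vec, minSim = 0.3): Turn[] {
  return history.filter((t) => cosine(t.embedding, centroid) >= minSim);
}

const centroid: Vec = [1, 0];
const history: Turn[] = [
  { content: 'sort this array',   embedding: [0.9, 0.1] },   // relevant
  { content: 'weather tomorrow?', embedding: [-0.2, 0.95] }, // off-chamber
];
console.log(filterContext(history, centroid).length); // 1
```

Every turn dropped here is tokens you never pay for, which is where the 40–70% savings come from.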

By the numbers

40–70%
Typical cost savings
~49%
Measured savings in a 50-query benchmark
19+
Chambers auto-discovered
Celo
On-chain telemetry

Ready to get started?

Open the dashboard and load telemetry from Celo, or install the SDK and wrap your chat calls.

Open Dashboard