The most advanced AI/AGI/LLM router and orchestrator for the distributed statespaces of the known universe. Model-agnostic. Blazing fast. Built in the open.
Every primitive you need to orchestrate AI across statespaces—local models, cloud APIs, and custom endpoints—unified under one interface.
CORE: Semantically route prompts to the optimal model based on cost, capability, latency, and task type. Policy-driven with fallback chains.
AGENTS: Compose complex agent pipelines. Chain, fan-out, debate, and critique across heterogeneous models with built-in state management.
STATE: Manage context windows, vector stores, and episodic memory across a distributed graph of nodes. First-class support for long-horizon tasks.
OPS: Full OpenTelemetry tracing across every hop. Cost attribution, token accounting, latency heatmaps, and anomaly detection built in.
API: Unified API across OpenAI, Anthropic, Gemini, Mistral, Ollama, Groq, Bedrock, and any custom endpoint. Switch providers with zero code changes.
POLICY: Declarative routing policies with OPA/Rego support. Define cost ceilings, capability requirements, compliance rules, and rate limits as code.
A layered orchestration stack — from raw request ingress to distributed model execution and response assembly.
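To make the routing idea above concrete, here is a minimal sketch of cost-, latency-, and capability-aware model selection with a fallback chain. Everything here is illustrative: the model names, the scoring weights, and the `Model`/`route` shapes are invented for this example and are not Aeonic's actual API.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD, hypothetical pricing
    p50_latency_ms: float
    capabilities: set

def route(task: str, models: list, max_cost: float) -> list:
    """Return a fallback chain: every capable model under the cost
    ceiling, ordered cheapest-and-fastest first."""
    candidates = [
        m for m in models
        if task in m.capabilities and m.cost_per_1k_tokens <= max_cost
    ]
    # Score trades cost against latency; the weight (100) is arbitrary here.
    return sorted(candidates,
                  key=lambda m: m.cost_per_1k_tokens * 100 + m.p50_latency_ms)

models = [
    Model("local-small", 0.0, 120, {"chat"}),
    Model("cloud-medium", 0.01, 400, {"chat", "code"}),
    Model("cloud-large", 0.03, 900, {"chat", "code", "vision"}),
]

chain = route("code", models, max_cost=0.05)
print([m.name for m in chain])  # cheapest capable model first, rest as fallbacks
```

The first element of the returned chain is the primary target; on failure or timeout, the orchestrator would retry down the list.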
From foundation to full AGI-grade orchestration — built openly, versioned transparently.
Unified API, provider adapters for major LLMs, basic cost-aware routing, OpenAI-compatible endpoint.
OPA integration, task-type classifiers, declarative routing policies, latency-aware load balancing.
Native agent primitives, fan-out/fan-in patterns, debate and critique loops, shared statespaces.
Vector-backed episodic memory, cross-session context, distributed checkpointing, graph-of-thought execution.
Self-improving routing policies, recursive agent spawning, goal decomposition, and full statespace exploration.
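One way to picture "declarative routing policies as code" from the roadmap above: a policy is data, and every request is evaluated against it before routing. The sketch below is a toy Python evaluator, not Aeonic's policy engine or a real OPA/Rego schema; in practice these rules would be written in Rego and evaluated by OPA. All field names are invented for illustration.

```python
# A toy policy: cost ceiling, capability requirements, and a compliance
# rule, expressed as plain data. Real deployments would express these as
# OPA/Rego policies; this only illustrates the shape of the idea.
POLICY = {
    "max_cost_usd": 0.10,              # hypothetical per-request cost ceiling
    "required_capabilities": {"code"},
    "blocked_regions": {"xx"},         # e.g. a compliance rule
}

def evaluate(policy: dict, request: dict) -> tuple:
    """Return (allowed, reasons) for a request under the policy."""
    reasons = []
    if request["estimated_cost_usd"] > policy["max_cost_usd"]:
        reasons.append("cost ceiling exceeded")
    if not policy["required_capabilities"] <= set(request["model_capabilities"]):
        reasons.append("missing required capability")
    if request["region"] in policy["blocked_regions"]:
        reasons.append("region blocked by compliance rule")
    return (not reasons, reasons)

ok, why = evaluate(POLICY, {
    "estimated_cost_usd": 0.02,
    "model_capabilities": ["chat", "code"],
    "region": "eu",
})
print(ok, why)  # True []
```

Keeping the policy as data means it can be versioned, reviewed, and rolled out independently of the router itself, which is the point of the policy-as-code design.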
Aeonic is MIT-licensed and built in the open. Whether you're routing two models or orchestrating a galaxy-brain AGI pipeline — we want your ideas.