Overview
AICP (AI Control Plane) is a governance layer for LLM applications that captures every AI decision as a queryable Decision Record with complete lineage tracking. If you can't trace what influenced a decision, you can't govern it — AICP solves this by sitting between your code and LLM providers.
What You Get
- Automatic Capture: Every LLM call generates a Decision Record with full context (inputs, outputs, costs, latency, model version)
- Lineage Graph: Neo4j-powered graph database tracks relationships between decisions, enabling impact analysis and compliance auditing
- Policy Enforcement: YAML-based policies for pre/post-execution validation, PII detection, cost controls, and human-in-the-loop approvals
- Cost Observability: Real-time cost tracking across providers (OpenAI, Anthropic, Google) with aggregations by user, feature, or time period
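To make these items concrete, the sketch below shows the general shape of a Decision Record as a Python dict. The field names are illustrative assumptions, not AICP's actual schema:

```python
# Hypothetical shape of a Decision Record; actual field names in AICP may differ.
decision_record = {
    "decision_id": "dec_01H...",          # unique identifier
    "timestamp": "2025-01-15T09:32:11Z",  # when the call was made
    "provider": "openai",
    "model": "gpt-4",
    "input": {"messages": [{"role": "user", "content": "Summarize Q3 revenue."}]},
    "output": {"content": "Q3 revenue grew 12%..."},
    "cost_usd": 0.0042,                   # computed from token usage
    "latency_ms": 1840,
    "policy_results": [{"policy": "pii-check", "outcome": "pass"}],
    "lineage": {"parent_decisions": ["dec_01G..."], "user_action": "report_generation"},
}
```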
How It Works
Flow:
- Your app sends LLM requests to AICP Gateway instead of directly to providers
- Gateway applies pre-execution policies (PII checks, cost limits, approvals)
- Request is proxied to the LLM provider (OpenAI, Anthropic, etc.)
- Response is evaluated against post-execution policies
- Complete Decision Record stored in TimescaleDB with lineage graph in Neo4j
- Response returned to your app (transparent proxy)
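The same flow, expressed as a minimal Python sketch of the gateway's per-request pipeline. Every name below is an illustrative placeholder, not the gateway's real internals:

```python
# Conceptual sketch of the gateway's per-request pipeline (names are illustrative).
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Verdict:
    action: str          # "allow", "block", or "warn"
    reason: str = ""

def handle_request(
    request: dict,
    pre_policies: List[Callable[[dict], Verdict]],
    post_policies: List[Callable[[dict], Verdict]],
    call_provider: Callable[[dict], dict],
    store_record: Callable[[dict, dict, List[Verdict]], None],
) -> dict:
    # 1. Pre-execution policies: block before the provider is ever called.
    for policy in pre_policies:
        verdict = policy(request)
        if verdict.action == "block":
            return {"error": f"Blocked by policy: {verdict.reason}"}

    # 2. Proxy the request to the actual LLM provider.
    response = call_provider(request)

    # 3. Post-execution policies: evaluate the provider's output.
    post_results = [policy(response) for policy in post_policies]

    # 4. Persist the Decision Record and its lineage (TimescaleDB + Neo4j in AICP).
    store_record(request, response, post_results)

    # 5. Return the response unchanged (transparent proxy).
    return response
```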
Architecture
AICP uses a gateway proxy pattern — your application makes a one-line SDK change to point at the AICP Gateway URL instead of the provider's URL. The gateway handles policy evaluation, telemetry, and storage, then proxies the request to the actual provider.
Before:

```python
from openai import OpenAI

client = OpenAI()
```

After:

```python
from aicp import OpenAI

client = OpenAI()  # Auto-routes through AICP Gateway
```
The API surface is 100% compatible with native SDKs — no refactoring required.
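Because the surface is SDK-compatible, existing call sites keep working after the import swap. The example below assumes the drop-in client mirrors the standard `chat.completions.create` interface and that the gateway URL and credentials are supplied via configuration:

```python
from aicp import OpenAI

client = OpenAI()  # gateway URL and credentials assumed to come from configuration

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Draft a two-sentence status update."}],
)
print(response.choices[0].message.content)
```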
Supported Providers
| Provider | SDK Support | Models | Status |
|---|---|---|---|
| OpenAI | Python, TypeScript | GPT-4, GPT-3.5, o1, o3 | ✅ Production |
| Anthropic | Python, TypeScript | Claude Opus, Sonnet, Haiku | ✅ Production |
| Google AI | Via LiteLLM | Gemini, PaLM | ✅ Beta |
| Any LiteLLM Provider | Custom integration | 100+ models | ⚠️ Advanced |
AICP uses LiteLLM internally for multi-provider routing, so any model that LiteLLM can reach can also be routed through AICP.
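For reference, this is the underlying LiteLLM call pattern outside of AICP; the provider prefix in the model string selects the backend. The exact model identifier depends on your LiteLLM version:

```python
import litellm

# The "provider/model" prefix tells LiteLLM which backend to route to.
response = litellm.completion(
    model="gemini/gemini-1.5-pro",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```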
Key Differentiators
Decision Records vs. Logs
Standard logging gives you raw text. Decision Records give you structured, queryable data with:
- Input/output preservation (full messages, not truncated)
- Automatic cost calculation per request
- Reversibility tier classification (high/medium/low risk)
- Policy evaluation results embedded
- Graph relationships to upstream/downstream decisions
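Since Decision Records land in TimescaleDB (PostgreSQL-compatible), they can be queried with plain SQL. The table, column, and connection details in this sketch are hypothetical; the real schema may differ:

```python
import psycopg2

# Hypothetical schema: aggregate per-user LLM spend over the last week.
QUERY = """
SELECT user_id,
       time_bucket('1 day', created_at) AS day,
       SUM(cost_usd)                    AS total_cost,
       COUNT(*)                         AS decisions
FROM decision_records
WHERE created_at > now() - interval '7 days'
GROUP BY user_id, day
ORDER BY total_cost DESC;
"""

with psycopg2.connect("postgresql://localhost:5432/aicp") as conn:
    with conn.cursor() as cur:
        cur.execute(QUERY)
        for row in cur.fetchall():
            print(row)
```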
Lineage Graph vs. Trace Trees
OpenTelemetry traces show execution paths. AICP's lineage graph shows causality:
- Which user action triggered this decision?
- Which decisions influenced this output?
- What impact would changing this decision have downstream?
- Answer natural language queries: "Find all decisions that affected user X's report"
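Because the lineage graph is stored in Neo4j, causality questions like these can be expressed in Cypher. The node labels, relationship types, and credentials below are illustrative assumptions, not AICP's actual graph model:

```python
from neo4j import GraphDatabase

# Illustrative only: find every upstream decision that influenced a given decision.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (upstream:Decision)-[:INFLUENCED*1..]->(d:Decision {id: $decision_id})
RETURN upstream.id AS decision_id, upstream.model AS model
"""

with driver.session() as session:
    for record in session.run(CYPHER, decision_id="dec_01H..."):
        print(record["decision_id"], record["model"])

driver.close()
```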
Proactive vs. Reactive Governance
Most LLM observability tooling is reactive: it tells you what went wrong after the fact. AICP policies are enforced at runtime:
- Block requests before they hit the provider (prevent PII leakage)
- Require human approval for high-risk actions (financial advice, code deletion)
- Warn on policy violations without blocking (soft enforcement)
- Tag decisions for audit trails
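As a rough illustration of runtime enforcement, the sketch below mimics a pre-execution PII check in Python. Real AICP policies are declared in YAML; this code only mirrors the blocking behavior described above:

```python
import re

# Very simple email detector used as a stand-in for a PII policy.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def pii_email_policy(request: dict) -> dict:
    """Block requests whose prompt contains an email address (illustrative only)."""
    text = " ".join(m.get("content", "") for m in request.get("messages", []))
    if EMAIL_PATTERN.search(text):
        return {"action": "block", "reason": "email address detected in prompt"}
    return {"action": "allow"}

print(pii_email_policy({"messages": [{"role": "user", "content": "Email jane@example.com"}]}))
# -> {'action': 'block', 'reason': 'email address detected in prompt'}
```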
Next Steps
Explore the core concepts:
- Decision Records — What gets captured
- Lineage Graph — How relationships work
- Policies — Governance configuration
- Approvals — Human-in-the-loop workflow