AI Agents as Manifests
Define the Graph. Any Language Runs It.

OpenAPI → Tools Generation
Built-in Authorization
Distributed Execution
OTel & FinOps Ready
Python
Available now
Graph
Compose any flow
Overlays
Extend without forking

Designed for What Comes After Hello World

From OpenAPI tool generation to distributed execution and cost tracking, every Liman feature is designed around real production pain points.

Core

OpenAPI → Tools Generation

Automatically generate LLM tools from OpenAPI specifications. Transform any API into agent tools without writing MCP servers.
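As a rough sketch of how this could look in a manifest, the example below is illustrative only: the `kind`, `openapi`, `spec`, and `include` fields are assumptions for this page, not Liman's confirmed schema.

```yaml
# Illustrative only: field names here are assumptions, not Liman's confirmed schema.
kind: ToolNode
name: petstore_tools
openapi:
  # Point at any OpenAPI spec; each listed operation becomes an LLM-callable tool.
  spec: ./specs/petstore.yaml
  include:
    - getPetById
    - findPetsByStatus
```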

Security

Service Account Authorization

Built-in authorization with service accounts. State isolation, credential provisioning, and least-privilege permission scopes.

i18n

Dynamic Prompt Localization

Multi-language support with automatic system prompt generation. Improve function-calling accuracy across languages.
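Building on the manifest format shown later on this page, localized prompts might look like the sketch below. The `en` key mirrors the documented example; the `es` key and the exact language-selection mechanism are assumptions.

```yaml
kind: LLMNode
name: assistant
prompts:
  system:
    en: |
      You are a helpful assistant.
    es: |
      Eres un asistente útil.
```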

Observability

OTel & FinOps Out-of-Box

Built-in OpenTelemetry integration with cost tracking. Monitor performance, token usage, and financial metrics automatically.

Flow Control

Condition Expression Language

Custom DSL for intelligent flow control between nodes. Express complex routing decisions declaratively without repetitive conditional logic.
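A hypothetical sketch of such routing, assuming an `edges`/`when` shape and expression syntax that are not confirmed by this page:

```yaml
# Illustrative only: the edge fields and DSL syntax are assumptions.
kind: Node
name: triage
edges:
  - to: escalate
    when: confidence < 0.5 or intent == "billing"
  - to: answer
    when: confidence >= 0.5
```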

Connectivity

Distributed Edges

Connect nodes via MCP, A2A, HTTP, WebSocket, or shared memory. Build distributed agents spanning AWS Lambda functions and local processes.
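As an illustrative sketch only (the `transport` field and its values are assumptions, not Liman's confirmed schema), a node might declare its transports like this:

```yaml
# Illustrative only: edge transport field names are assumptions.
edges:
  - to: search_tools
    transport: mcp        # tools exposed via a Model Context Protocol server
  - to: billing_agent
    transport: http       # e.g. an agent deployed behind AWS Lambda
  - to: scratchpad
    transport: shared_memory
```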

Plugins

Plugin Ecosystem

Extensible plugin system with built-in and custom plugins. Auto-context stitching, evaluation agents, and anomaly detection.

Config

Kustomize Overlays

Layer configurations using Kustomize-like overlays. Perfect for multi-environment deployments and language variants.
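Since the overlays are described as Kustomize-like, a per-environment layout might resemble the sketch below; the directory layout and patch shape are assumptions borrowed from Kustomize conventions, not Liman's confirmed format.

```yaml
# overlays/prod/kustomization.yaml — illustrative only
resources:
  - ../../base
patches:
  - target:
      name: assistant
    patch: |
      prompts:
        system:
          en: |
            You are a production support assistant.
```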

See How It Works

Define your agent in YAML. Run it with the SDK.

specs/assistant.yaml
kind: LLMNode
name: assistant
prompts:
  system:
    en: |
      You are a helpful assistant.
      Always be polite and provide clear answers.
main.py
TypeScript SDK coming soon
import asyncio

from langchain_openai import ChatOpenAI
from liman import Agent

agent = Agent(
    "./specs",
    start_node="assistant",
    llm=ChatOpenAI(model="gpt-5.4-mini"),
)

async def main() -> None:
    # Agent.step is a coroutine, so it must run inside an event loop.
    response = await agent.step("Hello!")
    print(response)

asyncio.run(main())

Stay Updated

Get the latest updates on Liman development, new features, and best practices delivered to your inbox.

No spam, unsubscribe at any time. We respect your privacy.

Get Involved

Participate in specification development and share experiences with other developers

Documentation

Explore guides, API reference, and usage examples to get started quickly

Read Docs

GitHub

Contribute to the code, report issues, and join discussions with the community

View Repository

Blog

Read articles about Liman development, new features, and practical agent patterns

Read Blog