What is Chainless?

Chainless is a lightweight, modular framework to build task-oriented AI agents and orchestrate them in intelligent flows.

It allows you to define agents, tools, and tasks in a composable and scalable way — ideal for both personal projects and production-grade pipelines.

```shell
pip install chainless
```

```python
from chainless import Tool, Agent, TaskFlow

def greet(name: str):
    return f"Hello, {name}!"

tool = Tool("Greeter", "Greets the user", greet)

agent = Agent("WelcomeAgent", tools=[tool])
flow = TaskFlow("GreetFlow").add_agent("WelcomeAgent", agent)

flow.step("WelcomeAgent", input_map={"input": "{{input}}"})

print(flow.run("Alice")["output"])
```

How Chainless Works

| Component | Description | Key Features |
| --- | --- | --- |
| Agent | A self-contained unit powered by an LLM and optional tools | Tool invocation, custom start hooks, system prompts, stateless or memory-driven design |
| Tool | A callable Python function with metadata | Reusable, easily registered, simple input/output schema |
| TaskFlow | Composable flow engine that connects agents together | Sequential/parallel execution, input mapping, output routing |
| Custom Start | Override how an agent processes data | Direct LLM calls, prompt chaining, arbitrary preprocessing logic |

Core Features

Modular Agents

Design lightweight agents that use tools or reason directly. Compose complex logic without over-engineering.
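As a rough mental model of this idea (a plain-Python sketch, not Chainless's actual classes or internals), an agent is a unit that holds a set of tools and either dispatches to one of them or falls back to answering directly:

```python
# Illustrative sketch of the agent concept, not Chainless internals:
# an agent holds tools and either dispatches to one or "reasons" directly.

class MiniTool:
    def __init__(self, name, description, func):
        self.name = name
        self.description = description
        self.func = func

class MiniAgent:
    def __init__(self, name, tools=None):
        self.name = name
        self.tools = {t.name: t for t in (tools or [])}

    def run(self, tool_name=None, **kwargs):
        # Dispatch to a named tool if one is requested...
        if tool_name in self.tools:
            return self.tools[tool_name].func(**kwargs)
        # ...otherwise fall back to direct "reasoning" (an LLM call in practice).
        return f"{self.name} answers directly: {kwargs}"

greeter = MiniTool("Greeter", "Greets the user", lambda name: f"Hello, {name}!")
agent = MiniAgent("WelcomeAgent", tools=[greeter])
print(agent.run(tool_name="Greeter", name="Alice"))  # Hello, Alice!
```

In a real Chainless agent, the fallback path is an LLM call rather than a stub string, and the LLM decides which tool to invoke.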

Composable TaskFlows

Connect agents in flexible sequences or parallel blocks. Build entire pipelines by just describing the data flow.
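To make the data-flow idea concrete, here is a minimal sequential-flow sketch with `{{input}}`-style template mapping. It mirrors the quickstart's `input_map` syntax but is an illustration of the pattern, not Chainless's implementation:

```python
import re

# Illustrative sketch of sequential flow composition with template-based
# input mapping (mirrors the "{{input}}" syntax, but is not Chainless code).

class MiniFlow:
    def __init__(self):
        self.steps = []  # (agent_callable, input_map) pairs

    def step(self, agent, input_map):
        self.steps.append((agent, input_map))
        return self

    def run(self, user_input):
        context = {"input": user_input}
        for agent, input_map in self.steps:
            # Resolve "{{key}}" placeholders against the running context.
            kwargs = {
                k: re.sub(r"\{\{(\w+)\}\}", lambda m: str(context[m.group(1)]), v)
                for k, v in input_map.items()
            }
            context["output"] = agent(**kwargs)
            context["input"] = context["output"]  # feed forward to the next step
        return context

shout = lambda text: text.upper()
exclaim = lambda text: text + "!!!"

flow = MiniFlow().step(shout, {"text": "{{input}}"}).step(exclaim, {"text": "{{input}}"})
print(flow.run("hello")["output"])  # HELLO!!!
```

Each step's output becomes the next step's `{{input}}`, which is the essence of output routing in a sequential pipeline.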

Native Python Tools

Define your own tools with pure Python. Chainless automatically wraps them with metadata for agent usage.
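The metadata-wrapping step can be sketched with the standard library's `inspect` module (Chainless does something similar internally; the `make_tool` helper below is illustrative, not its actual API):

```python
import inspect

# Illustrative sketch of auto-wrapping a plain function with metadata
# (not Chainless's actual code).

def make_tool(func, name=None, description=None):
    sig = inspect.signature(func)
    return {
        "name": name or func.__name__,
        "description": description or (func.__doc__ or "").strip(),
        # A simple input schema derived from the function's annotations.
        "inputs": {
            p.name: (p.annotation.__name__
                     if p.annotation is not inspect.Parameter.empty else "any")
            for p in sig.parameters.values()
        },
        "run": func,
    }

def greet(name: str) -> str:
    """Greets the user."""
    return f"Hello, {name}!"

tool = make_tool(greet)
print(tool["name"], tool["inputs"])  # greet {'name': 'str'}
print(tool["run"]("Alice"))          # Hello, Alice!
```

The name, description, and input schema are what an agent (or its LLM) sees when deciding whether and how to call the tool.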

Custom Execution Hooks

Fully control how an agent executes. Inject prompts, chain models, or pre/post-process using decorators.
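The decorator-based hook pattern looks roughly like this (the `on_start` name here is made up for illustration; consult the Chainless docs for the real decorator):

```python
# Illustrative sketch of a decorator-style execution hook; the on_start
# name is hypothetical, not Chainless's actual API.

class HookableAgent:
    def __init__(self, name):
        self.name = name
        self._start = None

    def on_start(self, func):
        # Register a custom start handler that replaces the default run path.
        self._start = func
        return func

    def run(self, user_input):
        if self._start is not None:
            return self._start(user_input)
        return f"default handling of {user_input!r}"

agent = HookableAgent("WelcomeAgent")

@agent.on_start
def custom_start(user_input):
    # Arbitrary preprocessing before (or instead of) the default LLM call.
    return f"custom: {user_input.strip().lower()}"

print(agent.run("  HELLO  "))  # custom: hello
```

Because the hook receives the raw input, it can preprocess, call an LLM directly, or chain several prompts before returning.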

Zero-Bloat, Server-Free

Keep it local. Chainless doesn’t force you into a runtime. No cloud service, no lock-in — just Python.

LLM Agnostic

Bring your own model — OpenAI, Groq, DeepSeek, Ollama, or any other provider. Chainless doesn't tie you to a specific LLM backend.


Next Steps