Quickstart
Build your first AI agent with Chainless in under 2 minutes.
Build Your First Chainless Agent
This quickstart shows you how to build an AI agent with Chainless, give it a tool, and orchestrate execution with TaskFlow.
Before starting, install Chainless and set up your preferred LLM provider; if you have not installed Chainless yet, follow the installation guide.
Chainless works with multiple LLM providers such as OpenAI, Anthropic, and Gemini. In this example, we configure OpenAI.
```python
import os

os.environ["OPENAI_API_KEY"] = "sk-***"
```

Alternatively, set the key in your shell before starting Python:

```bash
# macOS / Linux
export OPENAI_API_KEY=sk-***

# Windows
setx OPENAI_API_KEY sk-***
```
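Other providers follow the same pattern of exporting an API key before the agent is created. As a minimal sketch (which environment variable Chainless reads for each provider is an assumption here; see the installation guide for the exact names), Anthropic could be configured like this:

```python
import os

# Assumption: Chainless's Anthropic integration reads the standard
# ANTHROPIC_API_KEY environment variable used by the Anthropic SDK.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-***"
```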
Tools are regular Python functions wrapped with metadata so agents can call them.

```python
from chainless import Tool

@Tool.tool(
    name="web_search",
    description="Returns a simulated web search result."
)
def web_search(query: str):
    return f"Simulated search result for: {query}"
```

Agents use LLMs, tools, prompts, and structured outputs to perform tasks.
```python
from chainless import Agent
from chainless.models import ModelNames

agent = Agent(
    name="ResearchAgent",
    model=ModelNames.OPENAI_GPT_4O,
    tools=[web_search],
    system_prompt="Use the web_search tool whenever the user asks for information."
)
```

Invoke the agent directly using run() or run_async().
```python
result = agent.run("Search the latest AI breakthroughs.")
print(result.output)
```
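For non-blocking use inside asynchronous code, the same call can be awaited. A minimal sketch, assuming run_async() accepts the same input as run() and returns the same result object:

```python
import asyncio

async def main():
    # Assumption: run_async() mirrors run() and returns a result
    # object with an .output attribute.
    result = await agent.run_async("Search the latest AI breakthroughs.")
    print(result.output)

asyncio.run(main())
```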
TaskFlow enables multi-agent logic, routing, and orchestration.

```python
from chainless import TaskFlow

flow = TaskFlow("AI Research Flow", verbose=True)
flow.add_agent("Research", agent)

flow.step(
    "Research",
    input_map={"input": "{{input}}"}
)

result = flow.run("Give me updates about robotics research.")
print(result.output)
```
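The flow above routes everything to a single agent. As a hedged sketch of the multi-agent side, the example below chains a summarizing step onto the research step; the SummarizerAgent and the {{Research.output}} placeholder for referencing a previous step's result are assumptions extrapolated from the {{input}} template above, so check the TaskFlow reference for the exact syntax.

```python
from chainless import Agent, TaskFlow
from chainless.models import ModelNames

# Hypothetical second agent that condenses the research step's findings.
summarizer = Agent(
    name="SummarizerAgent",
    model=ModelNames.OPENAI_GPT_4O,
    system_prompt="Summarize the provided research findings in three bullet points."
)

flow = TaskFlow("Research and Summarize", verbose=True)
flow.add_agent("Research", agent)
flow.add_agent("Summarize", summarizer)

flow.step("Research", input_map={"input": "{{input}}"})
# Assumption: a later step can reference an earlier step's output by name
# in input_map; the exact placeholder syntax may differ in Chainless.
flow.step("Summarize", input_map={"input": "{{Research.output}}"})

result = flow.run("Give me updates about robotics research.")
print(result.output)
```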