Let’s build an agent that answers questions about Agno, remembers previous conversations, and runs as a production API, all in about 20 lines.
Save the following code as agno_assist.py:
```python
from agno.agent import Agent
from agno.db.sqlite import SqliteDb
from agno.models.anthropic import Claude
from agno.os import AgentOS
from agno.tools.mcp import MCPTools

agno_assist = Agent(
    name="Agno Assist",
    model=Claude(id="claude-sonnet-4-5"),
    db=SqliteDb(db_file="agno.db"),                     # session storage
    tools=[MCPTools(url="https://docs.agno.com/mcp")],  # Agno docs via MCP
    add_datetime_to_context=True,
    add_history_to_context=True,                        # include past runs
    num_history_runs=3,                                 # last 3 conversations
    markdown=True,
)

# Serve via AgentOS → streaming, auth, session isolation, API endpoints
agent_os = AgentOS(agents=[agno_assist])
app = agent_os.get_app()
```
That’s it. This gives you streaming responses, per-user session isolation, and a full API — no extra configuration.
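If you want to sanity-check the agent straight from Python before wiring up the server (you still need the dependencies and API key from the next section), here is a minimal sketch. It assumes the agent's async `aprint_response` helper and that the MCP toolkit connects on first use; depending on your Agno version the details may differ:

```python
# quick local check, independent of the AgentOS server
import asyncio

from agno_assist import agno_assist

# streams the answer to the terminal, pulling from the Agno docs MCP server
asyncio.run(agno_assist.aprint_response("What is Agno?", stream=True))
```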
Run your AgentOS
Set up your virtual environment:

```bash
uv venv --python 3.12
source .venv/bin/activate
```

Install dependencies:

```bash
uv pip install -U 'agno[os]' anthropic mcp
```

Export your Anthropic API key:

```bash
export ANTHROPIC_API_KEY=sk-***
```

Run your AgentOS:

```bash
fastapi dev agno_assist.py
```
This starts your AgentOS at http://localhost:8000. View the auto-generated API docs at http://localhost:8000/docs. You can add your own routes, middleware, or any FastAPI feature on top.
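Because `agent_os.get_app()` returns a standard FastAPI application, extending it works the usual FastAPI way. A minimal sketch; the `/health` route here is a made-up example for illustration, not an AgentOS endpoint:

```python
# agno_assist.py (continued): `app` is the FastAPI instance from agent_os.get_app()

@app.get("/health")  # hypothetical extra route, served alongside the AgentOS endpoints
def health() -> dict:
    return {"status": "ok"}
```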
Connect to the AgentOS UI
The AgentOS UI connects directly to your runtime, letting you monitor, manage, and test your agents.
- Open os.agno.com and sign in.
- Click “Add new OS” in the top navigation.
- Select “Local” to connect to a local AgentOS.
- Enter your endpoint URL (default: http://localhost:8000).
- Name it something like “Development OS”.
- Click “Connect”.
Once connected, you’ll see your OS with a live status indicator.
Chat with your Agent
Go to Chat in the sidebar, select your Agent, and ask “What is Agno?” — the agent will pull from the Agno docs MCP server to answer.
Click Sessions in the sidebar to view your Agent’s conversations. Data is stored in your local database — no third-party tracing required.
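To confirm that conversations really live in that local file, you can open agno.db with Python's built-in sqlite3 module. A minimal sketch; the exact table names depend on Agno's schema, so this just lists whatever is there:

```python
import sqlite3

# list the tables Agno created in the local session database
with sqlite3.connect("agno.db") as conn:
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()

print(tables)
```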
Next