Agent Builder + Flow API

Build AI agents visually.
Run them as APIs.

Compose agents on a drag-and-drop canvas — LLM provider, system prompt, tools, memory, retrieval, agent-to-agent. Deploy with a slug. Call with one HTTP request, OpenAI-SDK compatible.

13+ chat providers · OpenAI-compatible base URL · SSE / JSONL streaming · Always free
Featured product · Always free

Give your AI a lifetime memory.

Your AI loses context between sessions. We fix that. A Kanban board with native MCP integration — every task, decision, and file change is tracked automatically. Your LLM picks up exactly where it left off, across days and projects.

  • Native MCP integration — one bearer token, 19 tools. Works with Claude Code, Cursor, and Windsurf out of the box.
  • Auto-summary pipeline — epics and tickets condense themselves; the project README rewrites itself as work lands.
  • Epics & sub-tickets — every conversation, decision, and file change is captured for future sessions to read.
  • Always free — no quotas, no credit card, no time-limit beta. Just sign in.
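Under the hood, MCP tool invocations are JSON-RPC 2.0 "tools/call" requests. A minimal sketch of the wire shape, assuming the server speaks MCP over streamable HTTP at a hypothetical /mcp path; the tool name create_ticket comes from the list above, but the endpoint path and argument names are illustrative, not confirmed:

```python
import json

# Hypothetical MCP endpoint and token -- check your dashboard for real values.
MCP_URL = "https://trucopilot.com/mcp"  # assumed path, not confirmed
TOKEN = "YOUR_TOKEN"

# An MCP tool invocation is a JSON-RPC 2.0 "tools/call" request.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_ticket",  # one of the 19 tools
        "arguments": {"title": "Migrate auth to JWT"},  # argument names are illustrative
    },
}
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

body = json.dumps(payload)
# In practice the MCP client (Claude Code, Cursor, Windsurf) builds and sends
# this for you; it is shown here only to make the shape concrete.
print(body)
```

One bearer token in the Authorization header is the whole handshake; the client discovers the 19 tools and calls them as above.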
project memory · live
Backlog
  TRUC-156 Migrate auth to JWT
  TRUC-152 Add CSV import
In Progress
  TRUC-157 Promote feature to homepage
  TRUC-141 Reasoning input on chat-*
Completed
  TRUC-150 Demo agents seed
  TRUC-134 Public flow API
MCP connected · add_comment · get_project_status · create_ticket
Agent Builder

Drag, connect, ship.

Compose agents from typed nodes on a Drawflow-powered canvas. Wire an LLM provider into a system-prompt node, attach tools and a vector store, branch with switch / if-else logic. Save the flow, give it a slug, and you have a callable endpoint.

OpenAI · Anthropic · Google Gemini · Groq · xAI Grok · OpenRouter · Mistral · Together · DeepSeek · Cohere · Fireworks · Azure OpenAI · Ollama
  • LLM Provider: Anthropic · Sonnet
  • System Prompt: Support Triage
  • Memory: Conversation
  • Tool: search_kb
  • Switch: intent
  • Retrieval: vector store
  • Agent → Agent: Escalate
  • Custom Python: post_process
  • Output: JSON shape
Automation

Wire your business logic into agents.

Once the flow has a slug, it runs over HTTP. Three patterns teams build in their first afternoon:

💬 Customer-support routing

Classify incoming tickets, route billing to a finance-RAG agent, route bugs to engineering, escalate the rest. One Switch node, three downstream agents.
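From the caller's side, that whole pipeline is one HTTP request. A client-side sketch of the pattern; the input variable ticket_text and the response field route are assumptions for illustration, since the real names come from your flow's Input and Output nodes:

```python
import json
import urllib.request

API = "https://trucopilot.com/api/llm/{org_code}/flows/{slug}/run"  # fill from dashboard
TOKEN = "YOUR_TOKEN"

def route_ticket(ticket_text: str) -> dict:
    """Call the triage flow; the Switch node picks the downstream agent."""
    req = urllib.request.Request(
        API,
        data=json.dumps({"input": {"ticket_text": ticket_text}}).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def dispatch(result: dict) -> str:
    """Act on the routed result ("route" is a hypothetical output field)."""
    route = result.get("route", "escalate")
    return {"billing": "finance-queue", "bug": "eng-queue"}.get(route, "human-review")

# Offline demo of the dispatch step:
print(dispatch({"route": "billing"}))  # finance-queue
```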

📄 Content extraction

Send a PDF or webpage. The agent runs OCR, extracts entities into a typed JSON schema, and writes the result back to your database. Custom-Python node closes the loop.

🔎 Multi-step research

Decompose the query, fan out parallel retrieval against your vector store, summarize per-source, then merge into a single grounded answer. Agent-to-Agent edges orchestrate the rest.
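Inside a flow, Agent-to-Agent edges handle this fan-out for you; the same shape can also be driven from the client. An offline sketch of the pattern, where run_subflow stands in for an HTTP call to a retrieval sub-flow and the sub-flow names are invented:

```python
import concurrent.futures as cf

def run_subflow(slug: str, question: str) -> str:
    # Stand-in for a POST to /flows/{slug}/run; returns a per-source summary.
    return f"[{slug}] summary for: {question}"

def research(question: str, sources=("docs-rag", "web-rag", "tickets-rag")) -> str:
    # Fan out one retrieval call per source in parallel...
    with cf.ThreadPoolExecutor(max_workers=len(sources)) as pool:
        partials = list(pool.map(lambda s: run_subflow(s, question), sources))
    # ...then merge the per-source summaries into a single grounded answer.
    return "\n".join(partials)

print(research("How does slug-based deployment work?"))
```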

API Docs

One endpoint per agent.

Each saved flow becomes a callable endpoint. Token-authenticated, typed input vars, configurable response shape (raw, OpenAI ChatCompletion, or your own JSON template).

  • OpenAI-compatible — drop-in for any OpenAI client by swapping base_url.
  • Streaming via Server-Sent Events or JSON Lines.
  • Bearer-token auth per organization.
  • Schema-validated input from your flow's Input nodes.
  • Dynamically generated per-agent docs when you sign in.
View API docs →
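Streamed responses arrive as Server-Sent Events or JSON Lines. A minimal sketch of consuming the SSE form; how streaming is switched on and the exact event payload shape are not shown here, so the "delta" field and "[DONE]" sentinel (an OpenAI convention) are assumptions to check against the live API docs:

```python
import json

def parse_sse(lines):
    """Yield decoded JSON payloads from an SSE stream.

    Each event is a line of the form `data: {...}`; a bare `data: [DONE]`
    sentinel (OpenAI convention) ends the stream.
    """
    for raw in lines:
        line = raw.decode() if isinstance(raw, bytes) else raw
        if not line.startswith("data:"):
            continue  # skip blank keep-alives, comments, other SSE fields
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        yield json.loads(data)

# Offline demo with a canned stream; a real client would iterate over the
# HTTP response body line by line instead.
demo = [
    'data: {"delta": "Hel"}',
    '',
    'data: {"delta": "lo"}',
    'data: [DONE]',
]
text = "".join(chunk["delta"] for chunk in parse_sse(demo))
print(text)  # Hello
```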
cURL:

curl https://trucopilot.com/api/llm/{org_code}/flows/{slug}/run \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{ "input": { "question": "How do I reset my password?" } }'
JavaScript (fetch):

const r = await fetch(
  "https://trucopilot.com/api/llm/{org_code}/flows/{slug}/run",
  {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ input: { question: "..." } }),
  }
);
const data = await r.json();
Python (requests):

import requests

r = requests.post(
    "https://trucopilot.com/api/llm/{org_code}/flows/{slug}/run",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"input": {"question": "..."}},
)
print(r.json())
Python (OpenAI SDK):

from openai import OpenAI

client = OpenAI(
    base_url="https://trucopilot.com/api/llm/{org_code}/flows/{slug}",
    api_key=TOKEN,
)

resp = client.chat.completions.create(
    model="trucopilot-flow",
    messages=[{"role": "user", "content": "..."}],
)
print(resp.choices[0].message.content)

Replace {org_code} and {slug} with values from your dashboard.
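Production callers usually wrap the endpoint in a small retry loop for rate limits and transient errors. A generic sketch of that wrapper, not a documented rate-limit policy of this API:

```python
import time

def call_with_retry(do_request, max_attempts=4, base_delay=1.0):
    """Retry `do_request` (a zero-arg callable returning (status, body))
    on 429/5xx with exponential backoff; return the first successful body."""
    for attempt in range(max_attempts):
        status, body = do_request()
        if status < 400:
            return body
        if status == 429 or status >= 500:
            time.sleep(base_delay * (2 ** attempt))  # back off, then retry
            continue
        raise RuntimeError(f"request failed with status {status}")
    raise RuntimeError("retries exhausted")

# Offline demo: fail with 429 and 500, then succeed on the third attempt.
responses = iter([(429, None), (500, None), (200, {"ok": True})])
result = call_with_retry(lambda: next(responses), base_delay=0.0)
print(result)  # {'ok': True}
```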

More products

A focused suite, alongside the Agent Builder.

Tools we build because we hit the problem ourselves — image hosting, image generation, hallucination tracing, PHP caching, and the full management dashboard.

Always free

Start building today.

No credit card. No quotas. Sign in with Google and create your first agent in under a minute.

Sign in →