A plain-English walkthrough of everything: what AgentOS is, what you can actually build with it, and how every part of the platform works — no experience required.
For technical details jump to Quick Start or Primitives.
AgentOS is a backend platform for AI agents. Think of it as a cloud operating system that gives any AI agent — whether it lives inside Claude, GPT, a custom bot, or your own code — a real set of tools it can use:
You connect to AgentOS with a single API key. Every tool call goes through one endpoint: /mcp. No separate SDKs, no complicated setup.
Go to /signup. Enter your email address and a name for your agent. That's it.
You'll get back:
- an agent ID (`agent_abc123...`)
- an API key (a bearer token)

Lost your key? Sign in at /signin to generate a new bearer token from your browser session.
```bash
curl -s -X POST https://agentos-app.vercel.app/api/signup \
  -H "Content-Type: application/json" \
  -d '{"email":"you@example.com","agentName":"My Agent"}' | jq
```

```json
{
  "credentials": {
    "agentId": "agent_abc123...",
    "apiKey": "eyJhbGciOiJIUzI1NiJ9...",
    "expiresIn": "90 days"
  }
}
```

Store the key in your environment:

```bash
# .env
AGENT_OS_KEY=eyJhbGciOiJIUzI1NiJ9...
```
```javascript
const AGENT_OS_URL = 'https://agentos-app.vercel.app';
const API_KEY = process.env.AGENT_OS_KEY;

async function mcp(tool, input) {
  const res = await fetch(AGENT_OS_URL + '/mcp', {
    method: 'POST',
    headers: {
      Authorization: 'Bearer ' + API_KEY,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ tool, input }),
  });
  const data = await res.json();
  if (!res.ok) throw new Error(data.error || 'Agent OS error');
  return data.result;
}
```

All examples below use this `mcp()` helper.
After you have your API key, you have access to 6 primitives (categories of tools) and 30+ individual tools. Here's the plain-English breakdown:
**Memory** — Store key-value data in a fast cache. Perfect for remembering context between conversations, sessions, or API calls. Data can expire automatically (TTL).
Tools: `mem_set`, `mem_get`, `mem_delete`, `mem_list`

**Files** — Your agent gets 1 GB of private file storage. Write reports, logs, exports, generated images, CSVs — anything.
Tools: `fs_write`, `fs_read`, `fs_list`, `fs_delete`

**Database** — A private PostgreSQL-compatible database. Create tables, insert rows, run queries. Every agent gets an isolated schema — no shared data.
Tools: `db_create_table`, `db_insert`, `db_query`, `db_update`, `db_delete`

**Network** — Call any external API from your agent. Fetch live crypto prices, weather, or news headlines, send webhooks, hit your own backend.
Tools: `net_http_get`, `net_http_post`, `net_http_put`, `net_http_delete`

**Processes** — Run Python or JavaScript in a sandboxed environment. Parse data, run calculations, transform formats — anything code can do.
Tools: `proc_execute`, `proc_schedule`, `proc_spawn`, `proc_kill`

**Events** — Send real-time messages between agents or services. Trigger workflows, broadcast updates, coordinate multi-agent tasks.
Tools: `events_publish`, `events_subscribe`, `events_list_topics`

Below are complete, working examples of things real people build on AgentOS.
Every 5 minutes, fetch the live BTC price and store it. If the price drops more than 5% from the last stored high, log an alert to a file.
```javascript
// 1. Fetch live price from a public API
const price = await mcp('net_http_get', {
  url: 'https://api.coincap.io/v2/assets/bitcoin',
});
const currentPrice = parseFloat(price.body.data.priceUsd);

// 2. Read the last recorded high from memory
const lastHigh = parseFloat(await mcp('mem_get', { key: 'btc_high' }) ?? '0');

// 3. Update the high if needed
if (currentPrice > lastHigh) {
  await mcp('mem_set', { key: 'btc_high', value: String(currentPrice) });
}

// 4. Alert if price dropped >5% from high
const drop = ((lastHigh - currentPrice) / lastHigh) * 100;
if (drop > 5) {
  const alert = `[ALERT] BTC dropped ${drop.toFixed(1)}% from $${lastHigh.toFixed(0)} to $${currentPrice.toFixed(0)}`;
  const existing = await mcp('fs_read', { path: '/alerts.log' }) ?? '';
  await mcp('fs_write', {
    path: '/alerts.log',
    data: btoa(existing + '\n' + new Date().toISOString() + ' ' + alert),
  });
  console.log(alert);
}
```

Every time you research a topic, store key facts so your agent remembers them in future conversations — no matter what AI model you're using.
```javascript
// Save a research note
await mcp('mem_set', {
  key: 'research:solana-tps',
  value: 'Solana handles ~65,000 TPS theoretically, ~4,000 sustained in production as of Q1 2026.',
  ttl: 86400 * 30, // remember for 30 days
});

// Later — retrieve it
const note = await mcp('mem_get', { key: 'research:solana-tps' });
// → 'Solana handles ~65,000 TPS theoretically...'

// List all research notes
const allNotes = await mcp('mem_list', { prefix: 'research:' });
// → [{ key: 'research:solana-tps', ... }, ...]
```

Pull data from an external API, run a Python analysis on it, and save the output as a formatted report file — entirely in one agent run.
```javascript
// 1. Fetch data
const response = await mcp('net_http_get', {
  url: 'https://api.coingecko.com/api/v3/coins/markets?vs_currency=usd&order=market_cap_desc&per_page=10',
});
const coins = response.body;

// 2. Analyse with Python
const analysis = await mcp('proc_execute', {
  language: 'python',
  code: `
import json

# Parse via json.loads — JSON literals like true/null aren't valid Python
coins = json.loads('''${JSON.stringify(coins)}''')
report_lines = []
for c in coins:
    pct = c.get('price_change_percentage_24h', 0)
    arrow = '▲' if pct > 0 else '▼'
    report_lines.append(f"{c['symbol'].upper():8} \${c['current_price']:>12,.2f} {arrow} {abs(pct):.2f}%")
print('\\n'.join(report_lines))
`,
  timeout: 10000,
});

// 3. Save as file
const reportText = `Top 10 Coins — ${new Date().toUTCString()}\n` +
  `${'-'.repeat(40)}\n` + analysis.stdout;
await mcp('fs_write', {
  path: '/reports/crypto-daily.txt',
  data: btoa(reportText),
});
console.log('Report saved:', reportText);
```

Note the escapes inside the embedded Python: `\${...}` keeps the f-string's dollar sign out of JavaScript interpolation, and `\\n` survives the outer template literal as a Python newline escape.

Agent A completes some work and publishes an event. Agent B is subscribed and immediately picks it up — like a task queue, but for AI agents.
```javascript
// Agent A — publisher (when work is done)
await mcp('events_publish', {
  topic: 'tasks.completed',
  payload: {
    task_id: 'task_123',
    result: { status: 'success', output: 'analysis done' },
    agent_id: 'agent_A',
  },
});

// Agent B — subscriber (listening continuously)
const events = await mcp('events_subscribe', {
  topic: 'tasks.completed',
  limit: 10,
});
for (const event of events) {
  console.log('Agent B received task result:', event.payload);
  // ...process the result
}
```

A fully autonomous agent that monitors your X mentions, auto-replies with context, schedules posts at peak hours, and logs engagement to a DB — runs 24/7 without you.
```javascript
// Run this on a cron every 5 minutes
const AGENT_ID = process.env.AGENT_ID;

// 1. Fetch unseen mentions (stored cursor in memory)
const cursor = await mcp('mem_get', { key: 'x:last_mention_id' }) ?? '0';
const mentions = await mcp('net_http_get', {
  url: `https://api.twitter.com/2/users/${AGENT_ID}/mentions?since_id=${cursor}&max_results=10`,
  headers: { Authorization: 'Bearer ' + process.env.X_BEARER_TOKEN },
});
const tweets = mentions.body?.data ?? [];

for (const tweet of tweets) {
  // 2. Generate a reply using context from memory
  const persona = await mcp('mem_get', { key: 'x:persona' })
    ?? 'Helpful, technical, direct. Max 2 sentences.';
  const reply = await mcp('proc_execute', {
    language: 'javascript',
    code: `
      // Embed values via JSON.stringify so quotes in the tweet can't break the code
      const persona = ${JSON.stringify(persona)};
      const tweetText = ${JSON.stringify(tweet.text)};
      const res = await fetch('https://api.anthropic.com/v1/messages', {
        method: 'POST',
        headers: { 'x-api-key': process.env.ANTHROPIC_API_KEY, 'anthropic-version': '2023-06-01', 'content-type': 'application/json' },
        body: JSON.stringify({ model: 'claude-haiku-4-5-20251001', max_tokens: 100,
          messages: [{ role: 'user', content: 'Reply to this tweet (persona: ' + persona + '): ' + tweetText }] }),
      });
      const d = await res.json();
      console.log(d.content[0].text); // print so the reply lands in stdout
    `,
  });

  // 3. Post the reply
  await mcp('net_http_post', {
    url: 'https://api.twitter.com/2/tweets',
    headers: { Authorization: 'Bearer ' + process.env.X_BEARER_TOKEN, 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: reply.stdout.trim(), reply: { in_reply_to_tweet_id: tweet.id } }),
  });

  // 4. Log to DB
  await mcp('db_insert', { table: 'x_replies', data: { tweet_id: tweet.id, reply: reply.stdout.trim(), replied_at: new Date().toISOString() } });
}

// 5. Save cursor so we don't re-process
if (tweets.length > 0) await mcp('mem_set', { key: 'x:last_mention_id', value: tweets[0].id });

// 6. Schedule a post if it's peak hour (9am, 12pm, 6pm UTC)
const hour = new Date().getUTCHours();
if ([9, 12, 18].includes(hour)) {
  const nextPost = await mcp('db_query', {
    sql: 'SELECT id, content FROM scheduled_posts WHERE posted = false ORDER BY created_at ASC LIMIT 1',
  });
  if (nextPost[0]) {
    await mcp('net_http_post', {
      url: 'https://api.twitter.com/2/tweets',
      headers: { Authorization: 'Bearer ' + process.env.X_BEARER_TOKEN, 'Content-Type': 'application/json' },
      body: JSON.stringify({ text: nextPost[0].content }),
    });
    // Mark as posted by primary key rather than matching on content
    await mcp('db_update', { table: 'scheduled_posts', where: { id: nextPost[0].id }, data: { posted: true } });
  }
}
```

Five specialized agents coordinate a full marketing campaign: one writes copy, one posts to X, one handles Reddit, one tracks metrics, one optimizes based on results. They communicate via events.
```javascript
// ── AGENT 1: Copywriter ─────────────────────────────────────
// Generates campaign copy and publishes to the swarm
const topic = await mcp('mem_get', { key: 'campaign:topic' }); // e.g. "AgentOS v3.2 launch"
const copy = await mcp('proc_execute', {
  language: 'javascript',
  code: `
    const topic = ${JSON.stringify(topic)}; // embed safely — quotes in the topic can't break the code
    const res = await fetch('https://api.anthropic.com/v1/messages', {
      method: 'POST',
      headers: { 'x-api-key': process.env.ANTHROPIC_API_KEY, 'anthropic-version': '2023-06-01', 'content-type': 'application/json' },
      body: JSON.stringify({
        model: 'claude-sonnet-4-6',
        max_tokens: 500,
        messages: [{ role: 'user', content:
          'Write 3 variations of marketing copy for: ' + topic + '\\n' +
          '1. X post (max 280 chars, punchy)\\n' +
          '2. Reddit post (technical, with code snippet)\\n' +
          '3. Email subject line (urgency + benefit)\\n' +
          'Return as JSON: { x, reddit, email }' }],
      }),
    });
    console.log((await res.json()).content[0].text); // print so it lands in stdout
  `,
});
const variations = JSON.parse(copy.stdout);

// Broadcast to all agents via events
await mcp('events_publish', {
  topic: 'campaign.copy_ready',
  payload: { ...variations, topic, campaign_id: 'launch_v32', ts: Date.now() },
});

// ── AGENT 2: X Poster ────────────────────────────────────────
// Listens for copy_ready and posts to X
const events = await mcp('events_subscribe', { topic: 'campaign.copy_ready', limit: 1 });
if (events[0]) {
  const { x: text, campaign_id } = events[0].payload;
  const tweet = await mcp('net_http_post', {
    url: 'https://api.twitter.com/2/tweets',
    headers: { Authorization: 'Bearer ' + process.env.X_BEARER_TOKEN, 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });
  // Report metrics back
  await mcp('events_publish', {
    topic: 'campaign.metric',
    payload: { channel: 'x', campaign_id, tweet_id: tweet.body?.data?.id, posted_at: new Date().toISOString() },
  });
}

// ── AGENT 3: Reddit Poster ───────────────────────────────────
const redditEvents = await mcp('events_subscribe', { topic: 'campaign.copy_ready', limit: 1 });
if (redditEvents[0]) {
  const { reddit: body, campaign_id } = redditEvents[0].payload;
  // Post to relevant subreddit via Reddit API
  await mcp('net_http_post', {
    url: 'https://oauth.reddit.com/api/submit',
    headers: { Authorization: 'Bearer ' + process.env.REDDIT_TOKEN, 'User-Agent': 'AgentOS/1.0' },
    body: JSON.stringify({ sr: 'artificial', kind: 'self', title: 'AgentOS v3.2 drops today', text: body }),
  });
  await mcp('events_publish', { topic: 'campaign.metric', payload: { channel: 'reddit', campaign_id } });
}

// ── AGENT 4: Metrics Tracker ─────────────────────────────────
// Aggregates results from all channels into DB
const metrics = await mcp('events_subscribe', { topic: 'campaign.metric', limit: 50 });
for (const m of metrics) {
  await mcp('db_insert', {
    table: 'campaign_metrics',
    data: { ...m.payload, recorded_at: new Date().toISOString() },
  });
}
const summary = await mcp('db_query', {
  sql: 'SELECT channel, COUNT(*) AS posts FROM campaign_metrics WHERE campaign_id = $1 GROUP BY channel',
  params: ['launch_v32'],
});
await mcp('mem_set', { key: 'campaign:launch_v32:summary', value: JSON.stringify(summary) });

// ── AGENT 5: Optimizer ───────────────────────────────────────
// Reads metrics, decides what to double-down on
const campaignSummary = JSON.parse(await mcp('mem_get', { key: 'campaign:launch_v32:summary' }) ?? '[]');
const best = campaignSummary.sort((a, b) => b.posts - a.posts)[0]?.channel;
if (best) {
  // Tell copywriter to generate more content for winning channel
  await mcp('events_publish', {
    topic: 'campaign.optimize',
    payload: { action: 'boost', channel: best, reason: 'highest_engagement' },
  });
}
```

Give your agent a real SQL database to store structured data across sessions — customer records, task history, logs, anything.
```javascript
// 1. Create a table once (safe to call multiple times — checks first)
await mcp('db_create_table', {
  table: 'conversations',
  schema: [
    { column: 'id', type: 'uuid', primaryKey: true },
    { column: 'user_id', type: 'text', nullable: false },
    { column: 'message', type: 'text', nullable: false },
    { column: 'role', type: 'text', nullable: false },
    { column: 'created_at', type: 'timestamptz', nullable: false },
  ],
});

// 2. Insert a message
await mcp('db_insert', {
  table: 'conversations',
  data: {
    id: crypto.randomUUID(),
    user_id: 'user_42',
    message: 'What is the weather in London?',
    role: 'user',
    created_at: new Date().toISOString(),
  },
});

// 3. Query conversation history for a user
const history = await mcp('db_query', {
  sql: 'SELECT role, message, created_at FROM conversations WHERE user_id = $1 ORDER BY created_at DESC LIMIT 20',
  params: ['user_42'],
});
console.log('History:', history);
```

Skills are pre-built capabilities you can install and call instantly. Instead of writing code to parse PDFs, translate text, or process images — install a skill and call it with one line.
Browse at /marketplace. Skills are free or usage-based (you pay per call).
Install a skill:

```javascript
await fetch('https://agentos-app.vercel.app/api/skills/install', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer ' + API_KEY,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ skill_id: '<skill-uuid-from-marketplace>' }),
});
```

Call a skill:

```javascript
const result = await fetch('https://agentos-app.vercel.app/api/skills/use', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer ' + API_KEY,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    skill_slug: 'json-transformer',
    capability: 'filter',
    params: { array: myData, key: 'status', value: 'active' },
  }),
}).then(r => r.json());
console.log(result.result); // filtered data
```

List your installed skills:

```javascript
const { installed_skills } = await fetch('https://agentos-app.vercel.app/api/skills/installed', {
  headers: { Authorization: 'Bearer ' + API_KEY },
}).then(r => r.json());
installed_skills.forEach(s => console.log(s.skill.name, s.skill.slug));
```

The Studio is a browser-based terminal. Sign in at /signin and you can run any tool directly in your browser — no code required.
If you build something useful on top of AgentOS, you can publish it as a skill on the marketplace. Other agents can install and call your skill — and you receive 70% of all usage revenue.
A skill is a class named `Skill`. Each method corresponds to a capability.

```javascript
class Skill {
  // capability: "summarise"
  summarise({ text, maxWords = 50 }) {
    const words = text.trim().split(/\s+/);
    return {
      result: words.slice(0, maxWords).join(' ') + (words.length > maxWords ? '...' : ''),
      wordCount: words.length,
      truncated: words.length > maxWords,
    };
  }

  // capability: "wordCount"
  wordCount({ text }) {
    return {
      result: text.trim().split(/\s+/).length,
    };
  }
}
```

The Ops console is for platform administrators. It shows the autonomous crew — a set of AI agents that maintain continuous coverage of every feature and function on the platform.
Every platform capability has an active agent and a standby agent. If the active agent degrades or fails, the standby automatically takes over (failover).
- **Coverage state** — Whether a feature has both an active and standby agent assigned. "Fully covered" means both slots are healthy.
- **Health score** — A 0–1 score for each agent. Below ~0.5 triggers a triage suggestion; below ~0.2 triggers automatic failover.
- **Failover** — When the active agent degrades, the standby automatically becomes active. You can also trigger this manually from the UI.
- **Cron cycle** — A periodic health check run. Hits every active agent, generates suggestions for degraded ones, and performs failovers where needed.
- **Bootstrap** — Creates missing active/standby pairs for any uncovered features. Safe to run at any time.
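The triage/failover rule above can be sketched as a pure function. This is an illustration only — the function and field names (`evaluateCoverage`, `health`) are hypothetical, not the platform's actual internals; only the ~0.5 and ~0.2 thresholds come from the description above.

```javascript
// Hypothetical sketch of the health-score rule described above.
// Thresholds mirror the documented ~0.5 (triage) and ~0.2 (failover) cutoffs.
const TRIAGE_THRESHOLD = 0.5;
const FAILOVER_THRESHOLD = 0.2;

function evaluateCoverage({ active, standby }) {
  // Active agent has failed and the standby is healthy enough: swap roles.
  if (active.health < FAILOVER_THRESHOLD && standby.health >= FAILOVER_THRESHOLD) {
    return { action: 'failover', active: standby, standby: active };
  }
  // Degraded but not failed: surface a triage suggestion, keep roles.
  if (active.health < TRIAGE_THRESHOLD) {
    return { action: 'triage', active, standby };
  }
  return { action: 'none', active, standby };
}

console.log(evaluateCoverage({ active: { id: 'a1', health: 0.9 }, standby: { id: 'a2', health: 0.8 } }).action); // 'none'
console.log(evaluateCoverage({ active: { id: 'a1', health: 0.4 }, standby: { id: 'a2', health: 0.8 } }).action); // 'triage'
console.log(evaluateCoverage({ active: { id: 'a1', health: 0.1 }, standby: { id: 'a2', health: 0.8 } }).action); // 'failover'
```

A cron cycle would simply run this over every feature's agent pair and apply the returned action.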
FFP (Furge Fabric Protocol) is an optional decentralised consensus layer for critical financial operations. When enabled, any agent trying to call a sensitive domain (Binance, Coinbase, Stripe, PayPal, etc.) must get approval from the FFP network before the request is allowed through.
For most users, FFP is not needed. It is designed for high-stakes multi-agent deployments where you want a second layer of verification before money moves.
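To make the gating idea concrete, here is a rough sketch of what "approval before the request goes through" looks like. Everything in it is an assumption for illustration — the domain list, the `requiresConsensus` check, and the `requestApproval` callback are hypothetical stand-ins, not FFP's actual protocol or API; see the FFP documentation for the real setup.

```javascript
// Hypothetical sketch: gate outbound calls to sensitive financial hosts
// behind an approval step. Domain list and function names are illustrative.
const SENSITIVE_DOMAINS = ['api.binance.com', 'api.coinbase.com', 'api.stripe.com', 'api.paypal.com'];

function requiresConsensus(url) {
  const host = new URL(url).hostname;
  return SENSITIVE_DOMAINS.some(d => host === d || host.endsWith('.' + d));
}

async function gatedHttpPost(url, body, requestApproval) {
  if (requiresConsensus(url)) {
    // `requestApproval` stands in for the FFP network round-trip.
    const approved = await requestApproval({ url, body });
    if (!approved) throw new Error('FFP consensus denied: ' + url);
  }
  // In real use, forward via net_http_post; here we just report the decision.
  return { forwarded: true, url };
}
```

Non-sensitive domains pass through untouched, so ordinary `net_http_*` calls are unaffected.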
See the full setup guide: FFP documentation →
You now know everything you need to build on AgentOS. Sign up, grab your API key, and start with the one-liner below.
```javascript
// Your first Agent OS call — store anything
await mcp('mem_set', { key: 'hello', value: 'world', ttl: 3600 });
const v = await mcp('mem_get', { key: 'hello' });
console.log(v); // 'world'
```