Prerequisites

  • Bun 1.3.9 or later
  • An API key for your preferred LLM provider (e.g., OpenAI)
  • Node.js 18+ (optional, for Vercel deployment)
Verify your Bun version:
bun --version  # Should be ≥ 1.3.9

Create a New Project

Scaffold a new agent project with create-aixyz-app:
bunx create-aixyz-app my-agent
cd my-agent
bun install
Pass -y to skip interactive prompts and use defaults:
bunx create-aixyz-app my-agent -y

Project Structure

The scaffolded project follows the standard aixyz layout:
my-agent/
  aixyz.config.ts       # Agent metadata, payment config, and skills
  app/
    agent.ts            # Agent definition using Vercel AI SDK
    tools/
      weather.ts        # Example tool implementation
    icon.png            # Agent icon (served as a static asset)
  package.json
  tsconfig.json
  vercel.json           # Vercel deployment config
  .env.local            # Environment variables (API keys)
The aixyz.config.ts file is the central configuration. The app/ directory contains your agent logic, tools, and optional server overrides.
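To give a feel for what the central configuration holds, here is a minimal sketch of an aixyz.config.ts. This is illustrative only — the field names (name, description, skills, and so on) are assumptions, not the framework's documented schema; the scaffolded file in your project is the authoritative reference.

```typescript
// Hypothetical sketch of aixyz.config.ts — field names are
// illustrative assumptions; check the scaffolded file for the
// actual schema your aixyz version expects.
export default {
  // Agent metadata surfaced in the agent card
  name: "my-agent",
  description: "A demo agent that reports the weather",
  // Skills advertised to A2A clients
  skills: [
    {
      id: "get-weather",
      name: "Get Weather",
      description: "Returns the current weather for a city",
    },
  ],
};
```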

Run the Dev Server

Start the development server with hot reload:
bun run dev
This runs aixyz dev, which watches app/ and aixyz.config.ts for changes and automatically restarts the server. Your agent is available at http://localhost:3000.

Test Your Endpoints

Verify the agent card:
curl http://localhost:3000/.well-known/agent-card.json
Test the A2A JSON-RPC endpoint:
curl -X POST http://localhost:3000/agent \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tasks/send","id":"1","params":{"id":"task-1","message":{"role":"user","parts":[{"type":"text","text":"Hello"}]}}}'
The MCP endpoint is available at http://localhost:3000/mcp for MCP-compatible clients.
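The same tasks/send call can be made programmatically. The sketch below builds the JSON-RPC 2.0 envelope shown in the curl example and posts it with fetch; the endpoint path and method name come from the example above, while the helper names (buildTaskRequest, sendTask) are ours.

```typescript
// Build a JSON-RPC 2.0 envelope for the A2A tasks/send method,
// matching the curl example above.
function buildTaskRequest(taskId: string, text: string) {
  return {
    jsonrpc: "2.0",
    method: "tasks/send",
    id: "1",
    params: {
      id: taskId,
      message: {
        role: "user",
        parts: [{ type: "text", text }],
      },
    },
  };
}

// Post the request to the local dev server
// (assumes `bun run dev` is running on port 3000).
async function sendTask(text: string) {
  const res = await fetch("http://localhost:3000/agent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildTaskRequest("task-1", text)),
  });
  return res.json();
}
```

Bun runs TypeScript directly, so you can drop this in a scratch file and invoke it with `bun run`.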

Next Steps