Etles is a powerful, open-source autonomous agent platform built with Next.js and the AI SDK that delegates complex administrative, sales, and operational tasks to a suite of specialized AI sub-agents.
Read Docs · Features · Model Providers · Deploy Your Own · Running locally
- Autonomous Sub-Agents Framework
- Includes 16 specialized agents out of the box (e.g., SDR, Chief of Staff, Project Manager, Incident Responder, Cloud Cost Optimizer).
- Sub-agents run out-of-band to perform complex, multi-step actions and proactively delegate work.
- Deep Triggers & Automations (via Composio)
- Seamlessly connect to 100+ platforms (Jira, Slack, Salesforce, Stripe, GitHub).
- Configure active background triggers in real-time, enabling reactive agent workflows.
- Advanced Agent Toolkit
- Memory: Agents save, recall, and update long-term user memories via Upstash Vector.
- Scheduling: Fully conversational cron jobs and reminders powered by Upstash QStash.
- File Storage: Store and retrieve files with Vercel Blob.
- Generative UI: Interactive components stream charts, documents, and real-time weather directly into the chat.
- Next.js App Router
- Advanced routing with React Server Components (RSCs) and Server Actions for server-side rendering and performance
- AI SDK
- Unified API for generating text, structured objects, and tool calls with LLMs
- Hooks for building dynamic chat and generative user interfaces
- Supports OpenAI, Anthropic, Google, xAI, and other model providers via AI Gateway
- shadcn/ui
- Styling with Tailwind CSS
- Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
- Neon Serverless Postgres for saving chat history and user data
- Vercel Blob for efficient file storage
- Auth.js
- Simple and secure authentication
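To make the toolkit's memory feature concrete, here is a minimal sketch of the save/recall shape a memory tool could take. The function names are hypothetical and a plain in-memory `Map` stands in for Upstash Vector so the sketch is self-contained; the template itself wires these operations through AI SDK tool calls.

```typescript
// Illustrative only: a Map stands in for the Upstash Vector store the
// template actually uses, and the function names are hypothetical.
type MemoryRecord = { id: string; text: string };

const store = new Map<string, MemoryRecord>();

// Save (or update) a long-term memory under a stable key.
function saveMemory(id: string, text: string): MemoryRecord {
  const record: MemoryRecord = { id, text };
  store.set(id, record);
  return record;
}

// Recall a memory by key; returns undefined when nothing was saved.
function recallMemory(id: string): MemoryRecord | undefined {
  return store.get(id);
}

saveMemory('user-timezone', 'Prefers meetings after 10am PT');
console.log(recallMemory('user-timezone')?.text);
```

In the real template the agent decides when to call these operations as tools during a conversation, and recall is a semantic vector lookup rather than an exact key match.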
This template uses the Vercel AI Gateway to access multiple AI models through a unified interface. The default model is OpenAI GPT-4.1 Mini, with support for Anthropic, Google, and xAI models.
For Vercel deployments: Authentication is handled automatically via OIDC tokens.
For non-Vercel deployments: You need to provide an AI Gateway API key by setting the AI_GATEWAY_API_KEY environment variable in your .env.local file.
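For a non-Vercel deployment, the entry in `.env.local` looks like the following (the value shown is a placeholder, not a real key):

```
AI_GATEWAY_API_KEY=your-gateway-api-key
```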
With the AI SDK, you can also switch to direct LLM providers like OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
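Models routed through the AI Gateway are addressed by a `'provider/model'` string (e.g. `openai/gpt-4.1-mini`), whereas direct providers use their own SDK packages. As a self-contained illustration of that ID convention, here is a small helper; it is not part of the AI SDK or this template:

```typescript
// Illustrative helper (not part of the AI SDK): split a gateway model ID
// of the form 'provider/model' into its two parts.
function parseGatewayModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf('/');
  if (slash === -1) throw new Error(`not a gateway model ID: ${id}`);
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

console.log(parseGatewayModelId('openai/gpt-4.1-mini'));
// { provider: 'openai', model: 'gpt-4.1-mini' }

// With the AI SDK, switching to a direct provider is roughly:
//   import { anthropic } from '@ai-sdk/anthropic';
//   streamText({ model: anthropic('<model-id>'), ... })
```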
You can deploy your own version of Etles to Vercel with one click:
You will need to use the environment variables defined in `.env.example` to run Etles. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.
Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.
1. Install Vercel CLI: `npm i -g vercel`
2. Link local instance with Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
```bash
pnpm install
pnpm db:migrate # Set up the database or apply the latest changes
pnpm dev
```

Your app template should now be running on localhost:3000.