MCP Chat is a free, open-source chat app built with the AI SDK and Pipedream MCP, which provides access to nearly 3,000 APIs and more than 10,000 tools. Use it as a reference for building powerful AI chat applications.
Features · Model Providers · Prerequisites · Deploy Your Own · Running Locally
Check out the app in production at chat.pipedream.com and refer to Pipedream's developer docs for the most up-to-date information.
## Features

- MCP integrations: Connect to thousands of APIs through Pipedream's MCP server with built-in auth
- Automatic tool discovery: Execute tool calls across different APIs via chat
- Uses the AI SDK: Unified API for generating text, structured objects, and tool calls with LLMs
- Flexible LLM and framework support: Works with any LLM provider or framework
- Data persistence: Uses Neon Serverless Postgres for saving chat history and user data, and Auth.js for simple, secure sign-in
## Model Providers

The demo app currently uses models from Anthropic, OpenAI, and Gemini, but the AI SDK supports many more.
## Prerequisites

To run or deploy this app, you'll need:
- A Pipedream account
- A Pipedream project. Accounts connected via MCP will be stored here.
- Pipedream OAuth credentials
- An OpenAI API key
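Missing credentials usually surface as confusing runtime errors, so it can help to fail fast before starting the app. The sketch below is a hypothetical pre-flight helper — the exact variable names (e.g. `PIPEDREAM_CLIENT_ID`, `OPENAI_API_KEY`) are assumptions; check `.env.example` for the names this app actually expects.

```bash
# Hypothetical pre-flight check: report any required environment variables
# that are unset before launching the app. Variable names below are
# assumptions — see .env.example for the real ones.
require_env() {
  local v missing=0
  for v in "$@"; do
    # ${!v} is bash indirect expansion: the value of the variable named by $v
    if [ -z "${!v:-}" ]; then
      echo "missing: $v" >&2
      missing=1
    fi
  done
  return "$missing"
}

# Example usage:
# require_env PIPEDREAM_CLIENT_ID PIPEDREAM_CLIENT_SECRET OPENAI_API_KEY
```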
## Deploy Your Own

You can deploy this app to Vercel with one click.
## Running Locally

- Copy the environment file and add your credentials:

```bash
cp .env.example .env # Edit with your values
```
Note that for easier development, chat persistence and application sign-in are disabled by default in the `.env.example` file:

```bash
# In your .env file
DISABLE_AUTH=true
DISABLE_PERSISTENCE=true
```
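To exercise the full app locally — with chat history saved and sign-in enabled — flip those flags back and point the app at a Postgres instance. A sketch, assuming the flag names from `.env.example` and the local Postgres started via Docker as described below; `AUTH_SECRET` is the usual Auth.js convention and may differ in this app.

```bash
# In your .env file — example values for enabling persistence and sign-in
DISABLE_AUTH=false
DISABLE_PERSISTENCE=false

# Connection string for the local Postgres started via docker compose
POSTGRES_URL=postgresql://postgres@localhost:5432/postgres

# Auth.js signing secret (generate one with: openssl rand -base64 32)
AUTH_SECRET=your-generated-secret
```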
- Install dependencies and start the app:

```bash
pnpm install
pnpm dev
```
Your local app should now be running on http://localhost:3000 🎉
- Run all required local services:

```bash
docker compose up -d
```
- Run migrations:

```bash
POSTGRES_URL=postgresql://postgres@localhost:5432/postgres pnpm db:migrate
```
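If the migration fails, the connection string is the usual suspect. Below is a small, hypothetical sanity check on the URL format — the accepted schemes are an assumption; adjust them to whatever your Postgres client accepts.

```bash
# Hypothetical helper: check that a connection string looks like a
# Postgres URL before handing it to pnpm db:migrate.
valid_pg_url() {
  case "$1" in
    postgresql://*|postgres://*) return 0 ;;
    *) return 1 ;;
  esac
}

# Example:
# valid_pg_url "$POSTGRES_URL" || echo "POSTGRES_URL does not look right" >&2
```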