Imagine an agent that sits quietly in a group chat until someone calls on it. Ask "@agent, what's our refund policy?" and the agent replies with the correct information without interrupting the ongoing human conversation.

What You’ll Build

  • A Mastra agent that can participate in group conversations.
  • Mention-gated replies: the agent answers only when explicitly tagged (e.g., @agent).
  • The ability to fetch knowledge, call tools, or summarize discussions.
  • Optional: integration with CometChat group chats.

Prerequisites

  • A Mastra project (npx create-mastra@latest my-mastra-app).
  • Node.js installed.
  • OpenAI API key in .env as OPENAI_API_KEY.
  • A CometChat app with group chat enabled (optional, only needed for Step 4).

Step 1

Create the Agent

src/agents/group-chat-agent.ts:
import { openai } from '@ai-sdk/openai';
import { Agent } from '@mastra/core/agent';

export const groupChatAgent = new Agent({
  name: 'Group Chat Agent',
  instructions: `
You are a helpful assistant in a group chat. 
- Only respond when explicitly mentioned with @agent.
- Keep answers short and relevant.
- If unclear, say "I don’t know" rather than guessing.
  `,
  model: openai('gpt-4o-mini'),
});
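The "only respond when mentioned" rule above is soft guidance to the model. If you are wiring the agent into a chat yourself, it is safer to also gate calls in code before the model is invoked. A minimal sketch (the helper name and default handle are illustrative, not part of Mastra):

```typescript
// Hypothetical helper: decide whether a chat message explicitly mentions the agent.
// Gating in code is more reliable than relying on the prompt alone.
export function isMentioned(message: string, handle: string = '@agent'): boolean {
  // Match the handle as a whole word, case-insensitively ("@agent", "@Agent,").
  // Note: handle is interpolated into a regex, so keep it free of special characters.
  const pattern = new RegExp(`(^|\\s)${handle}\\b`, 'i');
  return pattern.test(message);
}
```

Call the agent only when isMentioned(text) returns true, and stay silent otherwise.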

Step 2

Register the Agent in Mastra

src/mastra/index.ts:
import { Mastra } from '@mastra/core/mastra';
import { PinoLogger } from '@mastra/loggers';
import { LibSQLStore } from '@mastra/libsql';

import { groupChatAgent } from '../agents/group-chat-agent';

export const mastra = new Mastra({
  agents: { 'group-chat': groupChatAgent }, // API path: /api/agents/group-chat/*
  storage: new LibSQLStore({ url: 'file:../mastra.db' }),
  logger: new PinoLogger({ name: 'Mastra', level: 'info' }),
});

Step 3

Run the Agent

rm -rf .mastra/output
npx mastra dev
You should see output similar to:
Mastra API running at http://localhost:4111/api
Test it locally:
curl -X POST http://localhost:4111/api/agents/group-chat/generate \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"@agent what is our refund policy?"}]}'
Expected output (the exact response shape can vary by Mastra version; the generated reply is in the text field):
{ "text": "Refunds are available within 30 days of purchase." }

Step 4

Deploy & Connect

  • Deploy the API (/api/agents/group-chat/generate) using Render, Railway, Vercel, or any host.
  • In CometChat Dashboard → AI Agents, create an agent with:
    • Provider: Mastra
    • Agent ID: group-chat
    • Deployment URL: public endpoint from your host
Now the agent will join group chats but only answer when mentioned.
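On the service side, the relay between the chat platform and the Mastra endpoint can stay very small. Here is a sketch, assuming the Step 3 endpoint and a response whose generated text is in a text field; handleGroupMessage, callMastraAgent, and the injected callAgent parameter are illustrative names, not CometChat or Mastra APIs:

```typescript
type CallAgent = (message: string) => Promise<string>;

// Forward a group-chat message to the agent only when it is explicitly mentioned.
// callAgent is injected so the gating logic can be tested without a live server.
export async function handleGroupMessage(
  text: string,
  callAgent: CallAgent,
  handle: string = '@agent',
): Promise<string | null> {
  // No mention: stay silent so the human conversation is not interrupted.
  if (!new RegExp(`(^|\\s)${handle}\\b`, 'i').test(text)) return null;
  // Strip the mention so the model sees a clean question.
  const question = text.replace(new RegExp(`${handle}\\b`, 'gi'), '').trim();
  return callAgent(question);
}

// Example wiring against the local dev server from Step 3.
export async function callMastraAgent(message: string): Promise<string> {
  const res = await fetch('http://localhost:4111/api/agents/group-chat/generate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: message }] }),
  });
  const data = await res.json();
  return data.text; // field name may differ across Mastra versions
}
```

Injecting callAgent keeps the mention filter independent of any particular host, so the same function works whether the reply goes back through CometChat or another platform.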

Troubleshooting

  • Agent talks too much: Tighten instructions to only respond when invoked.
  • Doesn’t respond to mentions: Ensure the incoming message actually contains the @agent mention.
  • Not visible in chat: Verify the agent is added as a user in CometChat and enabled in the Dashboard.

Next Steps

  • Extend the agent with tools (e.g., summarize-discussion, fetch-policy).
  • Add guardrails so the agent only responds to whitelisted topics.
  • Use in combination with a Multi-agent Orchestration (relay) agent to query multiple humans/agents.
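As a starting point for the first item, a tool definition might look like the following. This is a sketch using Mastra's createTool; the fetch-policy id and the hard-coded policy table are placeholders for a real knowledge source:

```typescript
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';

// Placeholder policy lookup -- swap for a real knowledge base or API call.
const POLICIES: Record<string, string> = {
  refund: 'Refunds are available within 30 days of purchase.',
};

export const fetchPolicyTool = createTool({
  id: 'fetch-policy',
  description: 'Look up a company policy by topic (e.g. "refund").',
  inputSchema: z.object({ topic: z.string() }),
  execute: async ({ context }) => ({
    policy: POLICIES[context.topic.toLowerCase()] ?? 'No policy found for that topic.',
  }),
});
```

Register it on the agent by adding tools: { fetchPolicyTool } to the Agent config from Step 1.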