Chattermill MCP Server

Learn how to connect Chattermill to your AI agents via MCP and surface customer feedback insights right where you work.

Written by Sophie McLeman
Updated today

The Chattermill MCP Server allows AI agents (such as Claude, Cursor, and Codex) to securely access your customer feedback. Once configured, your teams can query Chattermill insights directly within the AI agent they already use, asking questions in natural language and receiving answers instantly.

This guide explains how to connect an AI agent to the Chattermill MCP Server.


What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open standard that allows AI agents to securely connect to external applications, agents, and platforms.

Without MCP, AI agents can only work with information you manually paste into a chat or with public data. MCP enables AI agents to query your business data directly, while respecting existing permissions and security controls.

For Chattermill users, this means customer feedback insights from Chattermill can be accessed directly inside the AI agents they already use, without switching platforms or building custom integrations.

Figure: How AI agents access Chattermill through the MCP Server.


Supported AI Agents

The Chattermill MCP Server can be used with any AI agent that supports the Model Context Protocol (MCP).

The table below lists AI agents we’ve tested with the Chattermill MCP Server. For setup instructions, refer to each agent’s MCP documentation.

Agent | Authentication | Notes
Claude | OAuth | Requires a paid Claude plan
Claude Code | OAuth | CLI-based workflow
Cursor | OAuth | IDE-based workflow
ChatGPT | OAuth | Requires ChatGPT Plus or above
Codex | OAuth | Developer CLI
Mistral Le Chat | OAuth | Browser-based agent
Gemini CLI | OAuth | CLI-based agent
— | OAuth | MCP-compatible agent
— | OAuth | AI agent platform for building workflows
Notion AI | API Key | Business & Enterprise plans
— | API Key | MCP-compatible agent
Custom AI agents | Depends on agent | Any MCP-compatible implementation

If you need help connecting a specific AI agent, feel free to contact Chattermill support.

Known Limitations

Some AI agents have restrictions that affect MCP connectivity:

  • ChatGPT requires a paid plan (Plus or above) to enable MCP connectors in Developer Mode.

  • Microsoft Copilot (consumer) does not currently support MCP.

  • Microsoft 365 Copilot supports MCP only through administrator-deployed agents.

These limitations are imposed by the AI clients, not the Chattermill MCP Server.


What The Chattermill MCP Server Does (Examples)

The Chattermill MCP Server allows AI agents to securely access and analyze your customer feedback data.

By exposing Chattermill’s analysis capabilities through MCP, the server enables AI agents to retrieve insights, summaries, metrics, and raw feedback using natural-language queries.

See how Product Teams can use the Chattermill MCP to pull customer feedback insights directly into their AI agents.

See how CX teams can use the Chattermill MCP to surface customer feedback insights directly in AI Agents like Claude.


The Chattermill MCP Tools and Capabilities

The Chattermill MCP Server exposes MCP tools that allow AI agents to discover available data and retrieve customer feedback insights from Chattermill.

These tools fall into two groups: Discovery tools and Data tools.

Discovery tools

list_themes

Returns the categories and themes configured in a Chattermill project.

AI agents can use this to understand how feedback is organized before running deeper analysis.

list_attributes

Returns all attributes that can be used to filter feedback data, including built-in and custom segments such as country, product, or data source.

search_attributes

Searches for values within a specific attribute. This allows AI agents to discover valid filter values before applying them in queries. It’s used when an attribute has a high volume of possible values.

Example use cases:

  • Find available values for a specific attribute (e.g. country or product)

  • Discover valid filter inputs before querying feedback
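As an illustration, a search_attributes call arrives at the server as a standard MCP tools/call request. The JSON-RPC envelope below follows the MCP specification; the argument names (attribute, query) are assumptions for illustration, not Chattermill’s documented schema.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_attributes",
    "arguments": {
      "attribute": "country",
      "query": "ger"
    }
  }
}
```

Your AI agent constructs messages like this automatically; you never need to write them by hand.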

search_observations

Searches clusters of related customer feedback using keywords or theme identifiers.

Results include observation identifiers and representative snippets that can be used for further analysis.

Example use cases:

  • Find feedback clusters related to a specific topic

  • Identify observations linked to a theme before retrieving full feedback

list_metric_options

Returns available metric configuration options that can be used with the get_metrics tool.

This includes available chart types, metrics, frequencies, and breakdown options supported by Chattermill. AI agents can use this tool to discover how quantitative data can be queried before requesting metrics.

Data tools

generate_highlights

Generates an AI-powered summary of customer feedback based on a query, in the exact same style you’d find in the Chattermill app. This tool helps identify key issues, trends, or themes within a dataset.

get_metrics

Returns structured quantitative data from Chattermill, such as NPS trends, sentiment metrics, or feedback volume over time.

AI agents can use this data to analyze trends or generate charts.

get_feedback

Retrieves individual customer feedback responses matching a query or filter.

This allows AI agents to analyze verbatim feedback and validate insights using real customer comments.
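To make the data tools concrete, here is a sketch of how an agent might invoke get_metrics over MCP. The envelope follows the MCP specification; the argument names (metric, frequency) are hypothetical, and list_metric_options is the tool that reports the actual options the server accepts.

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_metrics",
    "arguments": {
      "metric": "nps",
      "frequency": "monthly"
    }
  }
}
```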


How The Chattermill MCP Server Works

At a high level, the integration works as follows:

  1. A user asks a question in an MCP-compatible AI agent (such as Claude, Cursor, Codex, or a custom agent).

  2. The AI agent analyzes the request and determines how to answer it, selecting from available tools, including the Chattermill MCP Server.

  3. When customer feedback data is needed, the agent invokes the appropriate Chattermill MCP tool with a natural-language query.

  4. The Chattermill MCP Server securely executes the request on Chattermill’s infrastructure and retrieves the relevant data.

  5. Structured results are returned to the AI agent.

  6. The agent may use these results directly or combine them with other tools and additional queries to generate a complete response.
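The steps above can be sketched as the JSON-RPC 2.0 messages an MCP client exchanges with the server. The method names ("tools/list", "tools/call") come from the MCP specification; the tool argument shown is a hypothetical illustration, not Chattermill’s documented schema.

```python
import json

def jsonrpc_request(req_id, method, params):
    """Build a JSON-RPC 2.0 request envelope as used by MCP clients."""
    return {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}

# Step 2: the agent discovers which tools the server offers.
discover = jsonrpc_request(1, "tools/list", {})

# Step 3: the agent invokes a Chattermill tool with a natural-language query.
invoke = jsonrpc_request(2, "tools/call", {
    "name": "generate_highlights",
    "arguments": {"query": "checkout experience"},  # hypothetical argument name
})

# Steps 4-5: the server executes the request and returns a JSON-RPC response,
# which the agent then folds into its answer (step 6).
print(json.dumps(invoke, indent=2))
```

In practice your AI agent handles this exchange for you; the sketch only shows what travels over the wire.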

This process allows the AI agent to retrieve accurate, structured insights from Chattermill while maintaining Chattermill’s existing data permissions and security controls.


Security and Permissions

The Chattermill MCP Server operates within Chattermill’s existing security and permission framework:

  • Chattermill roles and permissions apply

  • Users can only access data they already have permission to view

  • Users explicitly choose to connect their AI agent to Chattermill

  • All communication occurs over HTTPS

  • Data analysis runs on Chattermill infrastructure

The MCP Server does not grant access to any data beyond what a user can already view in Chattermill.


Example Prompts

The following example prompts can be used to confirm that your AI agent is correctly connected to the Chattermill MCP Server.

  • Summarize feedback
    Example: Summarize what customers are saying about the checkout experience.

  • Explore themes and patterns
    Example: Find feedback clusters related to a new feature.

  • Answer quantitative questions
    Example: What was our NPS last year compared to the previous year?

  • Analyze trends over time
    Example: Which customer issues increased in Germany this week versus last week?


How to Configure The Chattermill MCP Server

The Chattermill MCP Server is hosted and managed by Chattermill. AI agents act as MCP clients and implement their own configuration methods.

While the MCP server is compatible with any MCP-enabled AI agent, setup steps vary depending on the client. The examples below show how to configure several commonly used and recommended agents.

Authentication

The Chattermill MCP Server supports two authentication methods, depending on the capabilities of your AI agent:

1. OAuth (recommended)

OAuth allows users to securely connect their AI agent to Chattermill using their existing Chattermill account.

When connecting to the MCP server for the first time, your AI agent will open a browser window prompting you to log in to Chattermill.

Authentication takes place on the Chattermill login page at:

auth.chattermill.com

Once authenticated, the AI agent can retrieve insights from your Chattermill workspace based on your existing permissions.

2. Bearer token (API key)

Some AI agents support authentication using a bearer token. In this case, users can authenticate by providing their Chattermill API key as a token when configuring the MCP connection.
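For agents configured via a JSON file, a bearer-token connection typically looks like the sketch below. The mcpServers shape is a convention used by several MCP clients, not a Chattermill-specific format; check your agent’s documentation for its exact schema, and substitute your own API key.

```json
{
  "mcpServers": {
    "chattermill": {
      "url": "https://app.chattermill.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_CHATTERMILL_API_KEY"
      }
    }
  }
}
```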

Setup Instructions

The examples below show how to configure several commonly used AI agents. For all other supported agents, refer to their documentation for instructions on connecting to a remote MCP server.

We recommend setting all Chattermill MCP tools to “Always Allow” in your AI agent, as these tools are read-only.


Claude

MCP connectors are supported in the Claude web app.

For Teams and Enterprise plans (recommended):

1. Go to: https://claude.ai/admin-settings/connectors (admin access required)

2. Add a new connector using the MCP server URL: https://app.chattermill.com/mcp

3. Open Claude

4. Go to Connectors

5. Select the Chattermill connector

6. Authenticate via the browser when prompted

Alternative: Individual setup

If organisation-level setup is not available:

1. In Claude, go to Settings → Connectors

2. Click + Add custom connector

3. Enter the MCP server URL: https://app.chattermill.com/mcp

4. Save the connector

5. Authenticate via the browser when prompted


Claude Code

For organizations using Claude Team or Enterprise plans, the MCP connector can be installed at the organization level.

Administrators can install it from the Claude admin connectors page at https://claude.ai/admin-settings/connectors.

Once installed, users can connect by:

  1. Opening Claude Code

  2. Running:

/mcp

  3. Selecting the Chattermill connector

  4. Authenticating in the browser window

Alternative setup

If organization-level installation is not available, run:

claude mcp add --transport http --scope user --callback-port 60830 Chattermill https://app.chattermill.com/mcp

Then run:

/mcp

Select Chattermill and authenticate.


Cursor

Cursor supports connecting to remote MCP servers directly from the settings interface.

Steps:

  1. Open Cursor

  2. Navigate to Settings → MCP Servers

  3. Add a new MCP server

  4. Enter the server URL: https://app.chattermill.com/mcp

  5. Authenticate in the browser window.
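Cursor can also be configured by editing its MCP configuration file directly (~/.cursor/mcp.json for user-wide setup). A minimal entry might look like the following; this sketch assumes OAuth is completed in the browser on first connection, and the server key name ("Chattermill") is your choice.

```json
{
  "mcpServers": {
    "Chattermill": {
      "url": "https://app.chattermill.com/mcp"
    }
  }
}
```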


Other MCP-Compatible Agents

The Chattermill MCP Server can be used with any AI agent that supports the Model Context Protocol (MCP). Configuration steps vary depending on how each client implements MCP.

Refer to your AI agent’s documentation for instructions on connecting to a remote MCP server using the Chattermill MCP endpoint, or contact our support team.


Confirming the Configuration

Once configured, try a simple query in your AI agent, for example:

Summarize what customers are saying about the checkout experience.

If the connection is successful, your AI agent will retrieve insights from your Chattermill workspace.

If the configuration does not work as expected, please reach out to Chattermill support for assistance.


FAQ and Troubleshooting

These answers cover common questions about configuring and using the Chattermill MCP Server. For agent-specific configuration steps, refer to your AI agent’s documentation.

Q: Do I need a paid AI agent to use the Chattermill MCP Server?

Not necessarily. Many AI agents support connecting to MCP servers on both free and paid plans.

Availability of MCP features depends on the specific AI agent you are using. Refer to their documentation for details on MCP support and plan requirements.

Q: What data can the AI agent access through the Chattermill MCP Server?

The AI agent can only access data you already have permission to view in Chattermill.

This includes customer feedback data, such as Highlights (summaries), Observations (feedback clusters), quantitative insights (such as NPS and sentiment), and individual feedback entries.

The Chattermill MCP Server respects existing Chattermill roles, permissions, and data access rules. It does not grant access to additional data.

Q: Where does data processing happen?

Data processing happens across both the AI agent and Chattermill’s infrastructure. The AI agent interprets the user’s request, selects the appropriate MCP tool, and processes the results returned by Chattermill.

When customer feedback data is required, the Chattermill MCP Server securely executes the request on Chattermill’s servers and returns structured results to the AI agent. This ensures that customer feedback data remains securely stored and processed within Chattermill’s infrastructure, while allowing the AI agent to retrieve and present insights.

Q: Can I use the Chattermill MCP Server across multiple Chattermill projects?

Yes. By default, the MCP Server uses the project you most recently visited in the Chattermill application.

If you work across multiple projects, you can specify a project ID in your prompt to control which project is queried.

Q: Do I need to host or run the Chattermill MCP Server myself?

No. The Chattermill MCP Server is fully hosted and managed by Chattermill. You only need to configure your AI agent to use it.

Q: Which AI agents are supported?

The Chattermill MCP Server supports any AI agent that implements the Model Context Protocol (MCP).

We have tested and validated the MCP server with a range of commonly used AI agents, including Claude, Cursor, Claude Code, ChatGPT, Codex, Mistral Le Chat, Gemini CLI, Manus, Notion AI, and others.

Refer to the list above for supported agents and their authentication methods.

Q: Why don’t I see Chattermill insights in my AI agent?

Ensure the MCP configuration has been added correctly and that your API key is valid. After updating your configuration, restart your AI agent.

If the issue persists, refer to your AI agent’s documentation or contact Chattermill Support.

Q: Can I use the Chattermill MCP Server behind a corporate firewall?

Yes, provided your network allows HTTPS access to Chattermill’s API. Some enterprise environments may require network or proxy configuration. Contact your IT team if access is restricted.

Q: Where can I find my Chattermill API key?

You can generate and manage API keys in your Chattermill dashboard under Settings → API.

API keys can be used for authentication in AI agents that require header-based (bearer token) authentication, such as Notion AI.

Please note that API key access is currently restricted to Chattermill admins.

Q: Where can I get help if something isn’t working?

If you are experiencing issues, first verify your API key and MCP configuration. For agent-specific configuration questions, refer to your AI agent’s documentation.

If you need further assistance, contact Chattermill support.

Q: Will more agents be supported in the future?

The Chattermill MCP Server works with AI agents that support the Model Context Protocol (MCP) and allow connections to external MCP servers.

As more AI agents adopt MCP, they will be able to access Chattermill through the MCP Server. Some agents, such as ChatGPT, already support MCP but may require specific plans to use external MCP servers.

In other cases, MCP may only be available in certain versions of a product. For example, Gemini CLI supports MCP, while the consumer Gemini web app does not currently support MCP connections.

Refer to the documentation for your chosen AI agent to confirm MCP compatibility and requirements.


What’s Next?

The Chattermill MCP Server is the foundation for more advanced, agentic workflows. Additional capabilities and supported agents will be added over time.
