
What you need to know about the Model Context Protocol (MCP)

Jon Gitlin
Senior Content Marketing Manager
@Merge

Anthropic's recently released Model Context Protocol (MCP) reaffirms that large language models (LLMs) need customer data to provide reliable, personalized, and useful outputs.

The protocol will likely play an invaluable role in helping AI models and third-party applications integrate with one another. There are just a few things to consider when building and maintaining connections with this protocol.

Read on to learn how MCP works, the benefits it provides, and how unified API solutions can complement it.

What is the Model Context Protocol? 

MCP is an open standard protocol released by Anthropic. It allows AI models to connect directly with external data sources so that the models can read data from and write data to the connected applications.

More specifically, MCP includes:

  • An MCP client: a component that's embedded in the LLM-powered application (the host) and facilitates the model's interactions with external data sources
  • An MCP server: a lightweight program that exposes data and functionality from external systems. These systems can be local data sources, like files and databases, or remote services, such as the APIs of applications like Salesforce and Box
  • Tools: functions exposed by the MCP server that let LLMs access specific data and execute actions in response to user prompts (a minimal server sketch follows the diagram below)
How MCP works
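
To make these pieces concrete, here's a minimal sketch of an MCP server built with the MCP Python SDK's FastMCP helper. The server name and the search_files tool are illustrative placeholders, not part of the protocol itself.

# A minimal MCP server sketch using the MCP Python SDK (the "mcp" package).
# The server name and tool below are illustrative placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")

@mcp.tool()
def search_files(query: str) -> str:
    """Search a local data source and return a short result for the LLM."""
    # A real server would query a file system, database, or remote API here.
    return f"No files matched '{query}' (placeholder implementation)"

if __name__ == "__main__":
    # Serve over stdio so an MCP client (e.g., one embedded in Claude Desktop) can connect.
    mcp.run()

Once the server is running, the MCP client lists the server's tools over the protocol and invokes search_files whenever the LLM decides it's relevant to a prompt.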

Related: How MCP compares to APIs

Examples of using MCP

Here are just a few ways to leverage MCP.

Power a customer-facing support chatbot 

Say you offer a customer-facing AI chatbot that can use several LLMs (e.g., GPT-4o and Claude 3.5), depending on the support task it's performing.

To give all of the chatbot's underlying LLMs the context they need to understand and perform tasks, you can integrate the LLMs with the relevant support applications via MCP and grant them read and write access across these connected systems.

Using MCP, you can build integrations that allow LLMs to access and use customers' ticketing data
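
As a rough illustration, the MCP server behind a ticketing integration might expose read and write tools along these lines. The tool names, fields, and return values are hypothetical, standing in for calls to a real ticketing API.

# Hypothetical MCP server tools for a support-ticketing integration.
# The ticket data is hard-coded; a real server would call the ticketing system's API.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("ticketing-server")

@mcp.tool()
def get_open_tickets(customer_id: str) -> list[dict]:
    """Return the customer's open support tickets."""
    return [{"id": "TKT-101", "subject": "Login issue", "status": "open"}]

@mcp.tool()
def add_ticket_comment(ticket_id: str, body: str) -> str:
    """Append a comment to an existing ticket (a write operation)."""
    return f"Comment added to {ticket_id}"

if __name__ == "__main__":
    mcp.run()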

Support enterprise AI search

Imagine that you offer an AI assistant that lets customers' employees ask a wide range of questions and receive plain-text answers (using natural language processing).

To ensure the AI assistant can answer a broad range of questions, you can integrate its underlying LLM with the customers' file storage systems via MCP. The LLM can then ingest the contents of the stored documents and not only use them to generate outputs but also link out to the documents themselves.
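
One way to wire this up, sketched below with the MCP Python SDK, is to expose each stored document as an MCP resource so the LLM can both read its contents and reference its URI. The docs:// URI scheme and the in-memory document store are assumptions made for illustration.

# Hypothetical MCP server that exposes documents from a file storage system as resources.
# DOCUMENTS stands in for a real file storage API such as Box or Google Drive.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("file-storage-server")

DOCUMENTS = {
    "onboarding-guide": "Welcome to the team! Start by ...",
    "pto-policy": "Employees accrue PTO at a rate of ...",
}

@mcp.resource("docs://{doc_id}")
def read_document(doc_id: str) -> str:
    """Return the text of a stored document so the LLM can use and cite it."""
    return DOCUMENTS.get(doc_id, "Document not found")

if __name__ == "__main__":
    mcp.run()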

A screenshot of DoraAI, an AI chatbot for employees
AI assistant providers—like Assembly’s DoraAI—can connect their LLM to customers’ file storage systems to answer employees’ questions in plain text

Related: Enterprise AI search use cases

Enable AI agents to act as recruiting coordinators

Now say you want to power AI agents that help customers manage interviews. 

More specifically, you want the AI agent to not only remind interviewers about an upcoming interview but also provide context on candidates to help these interviewers prepare quickly, easily, and effectively. 

To power this, you can integrate your AI agent’s LLM with your customers’ applicant tracking systems (ATSs) through MCP. 

The AI agent can then ingest the information provided in the applications (from resumes to cover letters to LinkedIn profiles), allowing it to generate candidate summaries in a place that's convenient for interviewers (e.g., Slack).
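
A rough sketch of what such a server could expose is below; get_candidate and post_slack_message are hypothetical stand-ins for calls to an ATS API and Slack's API.

# Hypothetical MCP server tools for a recruiting-coordinator agent.
# Both functions return canned data; a real server would call the ATS and Slack APIs.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("recruiting-server")

@mcp.tool()
def get_candidate(candidate_id: str) -> dict:
    """Fetch a candidate's application details from the ATS."""
    return {
        "name": "Jordan Rivera",
        "role": "Backend Engineer",
        "resume_highlights": "5 years of backend and database experience",
    }

@mcp.tool()
def post_slack_message(channel: str, text: str) -> str:
    """Post a candidate summary where interviewers will see it."""
    return f"Posted to {channel}"

if __name__ == "__main__":
    mcp.run()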

A screenshot from a Nova agent that helps an interviewer prep for their upcoming calls
Recruiting coordinator AI agents—like Peoplelogic’s “Noah” agent—can use customers’ ATS data to generate summaries on candidates

Benefits of using MCP

The Model Context Protocol offers several benefits that should help drive widespread adoption:

  • Simplifies the build process: A single, standard protocol gives LLM providers and SaaS applications a clearer, easier path to integrating with one another
  • Supports workflow definitions: It provides a structured way for LLMs to retain, update, and retrieve context, which allows them to manage and advance workflows autonomously
  • Enhances LLM efficiency: By standardizing context management, MCP minimizes unnecessary processing for LLMs
  • Strengthens security and compliance: It offers standardized governance over how context is stored, shared, and updated across different environments

Related: Tips for using MCP

What MCP doesn’t address at the moment

Here are a few areas it doesn’t cover:

  • Managing rate limits optimally: MCP doesn't optimize data syncing within an integration provider's rate limits, so you'll need to implement your own throttling strategies
  • Authenticating to an endpoint: MCP doesn't handle authentication or specify how it should be implemented, leaving that decision to the integration provider
  • Handling errors: MCP doesn't enforce a standardized error-handling framework or response status codes; these are defined by each API provider. That said, MCP lets you surface errors as JSON-RPC 2.0 error messages, which include code, message, and data fields (see the example after this list). Learn more here
  • Supporting webhooks: The protocol doesn't include webhooks or an event-driven architecture for instant data updates (although it does support real-time syncs via server-sent events)
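
For reference, a JSON-RPC 2.0 error response carries exactly those three fields. A failed tool call might come back looking something like this (the code, id, and contents are illustrative):

{
  "jsonrpc": "2.0",
  "id": 7,
  "error": {
    "code": -32602,
    "message": "Invalid params",
    "data": {
      "detail": "ticket_id is required"
    }
  }
}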

How unified APIs relate to MCP

Unified API solutions, which let you add hundreds of integrations to your product through a single, aggregated API, complement MCP for any integration, whether that means managing authentication, normalizing data, securing access, or speeding up syncs.

Data normalization

A unified API solution normalizes all of the integrated customer data, converting it to a predefined data model. This ultimately allows an LLM to handle prompts with more precision.

How normalized data leads to better outputs
An LLM can use normalized data to answer prompts—like “Give me the marketing team’s first names and email addresses”—successfully
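
As a simplified, hypothetical illustration (the field names below aren't Merge's actual data model), normalization maps each provider's differently named fields onto one shared shape before the LLM ever sees the data:

# Simplified, hypothetical illustration of data normalization: two providers return
# the same information under different field names, and both are mapped to one
# predefined model before the LLM uses the data.

def normalize_employee(raw: dict, provider: str) -> dict:
    """Map a provider-specific employee record onto a common data model."""
    if provider == "hris_a":
        return {"first_name": raw["firstName"], "email": raw["workEmail"]}
    if provider == "hris_b":
        return {"first_name": raw["given_name"], "email": raw["email_address"]}
    raise ValueError(f"Unknown provider: {provider}")

# Both records normalize to {'first_name': 'Ada', 'email': 'ada@example.com'}
print(normalize_employee({"firstName": "Ada", "workEmail": "ada@example.com"}, "hris_a"))
print(normalize_employee({"given_name": "Ada", "email_address": "ada@example.com"}, "hris_b"))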

Security

Unified API solutions can secure your integrations by giving you full control of the customer data you can access and who on your team can access it.

For example, a unified API solution can offer scopes, which give you or your customers the ability to toggle off the specific fields that customers don't want you to access and sync.

Observability

Unified API solutions can offer a full suite of integration observability features to help your customer-facing team manage any of your MCP-based integrations. This includes everything from automated issue detection to fully searchable logs.

Performance

Finally, unified API solutions can support integrations with fast sync speeds. And many support webhooks to sync data in real time, allowing an LLM to use the latest data for each customer.

Enable your LLM to access all of your customers' data via Merge MCP

We've just released our own MCP server—Merge MCP—to help you access all 220+ of our customer-facing integrations!

You’ll still get access to the rest of our features and capabilities, from our integration maintenance support to our features that let your customer-facing teams manage integrations independently.

You can even implement Merge MCP with a few lines of code:

{
  "mcpServers": {
    "merge-mcp": {
      "command": "uvx",
      "args": ["merge-mcp"],
      "env": {
        "MERGE_API_KEY": "xxxxxxx",
        "MERGE_ACCOUNT_TOKEN": "xxxxxxx"
      }
    }
  }
}



Jon Gitlin is the Managing Editor of Merge's blog. He has several years of experience in the integration and automation space; before Merge, he worked at Workato, an integration platform as a service (iPaaS) solution, where he also managed the company's blog. In his free time he loves to watch soccer matches, go on long runs in parks, and explore local restaurants.



Want to learn more about Merge MCP?

Learn more about Merge MCP and how it can power your AI product by scheduling a demo with one of our integration experts.

Schedule a demo