What you need to know about the Model Context Protocol (MCP)
Anthropic's recently released Model Context Protocol (MCP) reaffirms that large language models (LLMs) need customer data to provide reliable, personalized, and useful outputs.
The protocol will likely play an invaluable role in helping AI models and third-party applications integrate with one another, but there are a few things to consider when building and maintaining connections with it.
Read on to learn how MCP works, the benefits it provides, and how unified API solutions can complement it.
What is the Model Context Protocol?
MCP is an open standard protocol released by Anthropic. It allows AI models to connect directly with external data sources so that the models can read data from and write data to the connected applications.
More specifically, MCP includes:
- An MCP client: a component embedded in the AI application that connects to MCP servers and facilitates interactions with external data sources
- An MCP server: a lightweight program that exposes data and functionality from external systems. These systems can come from local data sources, like files and databases, or remote services, such as APIs from applications like Salesforce and Box
- Tools: allow LLMs to access specific data and execute functionality exposed by the MCP server in response to user prompts
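Under the hood, clients and servers exchange JSON-RPC 2.0 messages; a tool invocation uses the protocol's `tools/call` method. The sketch below shows roughly what those messages look like. The method name follows the MCP specification, but the tool name (`search_tickets`) and its arguments are hypothetical examples, not part of the protocol.

```python
import json

# A request the MCP client sends when the LLM wants to invoke a tool.
# "tools/call" is defined by the MCP spec; the tool and arguments are
# hypothetical and would be defined by a specific MCP server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_tickets",          # hypothetical tool exposed by a server
        "arguments": {"query": "refund"},  # shaped by that tool's input schema
    },
}

# The client serializes the request and sends it over the transport
# (e.g., stdio or HTTP); the server replies with a result keyed to the same id.
wire_message = json.dumps(request)
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "3 open tickets match."}]},
}

print(json.loads(wire_message)["method"])  # tools/call
```

The `id` field is how the client pairs each response with the request that produced it, which matters once multiple tool calls are in flight.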
Related: How MCP compares to APIs
Examples of using MCP
Here are just a few ways to leverage MCP.
Power a customer-facing support chatbot
Say you offer a customer-facing AI chatbot that can use several LLMs (e.g., GPT-4o, Claude 3.5, etc.), depending on the support task it’s performing.
To give each of the chatbot's underlying LLMs the context it needs to understand and perform support tasks, you can integrate the LLMs with the relevant support applications via MCP and grant them read and write access across these connected systems.

Support enterprise AI search
Imagine that you offer an AI assistant that lets your customers' employees ask a wide range of questions and receive answers in natural language.
To ensure the AI assistant can answer a broad range of questions, you can integrate its underlying LLM with the clients’ file storage systems via MCP. The LLM can then ingest the contents from the documents and not only use these contents to generate outputs but also link out to the documents themselves.

Related: Enterprise AI search use cases
Enable AI agents to act as recruiting coordinators
Now say you want to power AI agents that help customers manage interviews.
More specifically, you want the AI agent to not only remind interviewers about an upcoming interview but also provide context on candidates to help these interviewers prepare quickly, easily, and effectively.
To power this, you can integrate your AI agent’s LLM with your customers’ applicant tracking systems (ATSs) through MCP.
The AI agent can then ingest the information in each application, from resumes to cover letters to LinkedIn profiles, allowing it to generate candidate summaries in a place that's convenient for interviewers (e.g., Slack).

Benefits of using MCP
The Model Context Protocol offers several benefits that'll help support widespread adoption:
- Simplifies the build process: By providing a single, standard protocol, LLM providers and SaaS applications have a clearer and easier path to integrating with one another
- Supports workflow definitions: It provides a structured way for LLMs to retain, update, and retrieve context, which allows the LLMs to manage and progress workflows autonomously
- Enhances LLM efficiency: By standardizing context management, MCP minimizes unnecessary processing for LLMs
- Strengthens security and compliance: It offers standardized governance over how context is stored, shared, and updated across different environments
Related: Tips for using MCP
What MCP doesn’t address at the moment
Here are a few areas it doesn’t cover:
- Managing rate limits optimally: MCP doesn’t optimize data syncing within an integration provider’s rate limits, requiring you to implement throttling strategies
- Authenticating to an endpoint: MCP doesn’t handle authentication or specify how it should be implemented, leaving that decision to the integration provider
- Handling errors: MCP doesn't enforce a standardized error-handling framework or response status codes; these are defined by each API provider. That said, MCP messages can carry JSON-RPC 2.0 error objects, which include code, message, and data fields
- Supporting webhooks: The protocol doesn't include webhooks or an event-driven architecture for instant data updates (although it does support real-time syncs via server-sent events)
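To make the error-handling point concrete, here's a sketch of the JSON-RPC 2.0 error shape an MCP response can carry. The `code`, `message`, and `data` fields come from the JSON-RPC 2.0 specification (where `-32602` is the standard "Invalid params" code); the contents of `data` are hypothetical, since MCP leaves server-specific detail to each implementation.

```python
# A JSON-RPC 2.0 error response, as it might come back from an MCP server.
# code/message/data are defined by the JSON-RPC 2.0 spec; the data payload
# here is a hypothetical server-specific detail.
error_response = {
    "jsonrpc": "2.0",
    "id": 7,
    "error": {
        "code": -32602,               # standard "Invalid params" code
        "message": "Invalid params",
        "data": {"missing": ["query"]},  # hypothetical extra context
    },
}

def is_error(msg: dict) -> bool:
    """JSON-RPC marks failure by the presence of an `error` member
    (a successful response carries `result` instead)."""
    return "error" in msg

print(is_error(error_response), error_response["error"]["code"])
```

Because the framework isn't standardized beyond this envelope, your client code still has to decide per provider how to interpret each code and whether to retry.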
How unified APIs relate to MCP
Unified API solutions, which let you add hundreds of integrations to your product through a single, aggregated API, complement MCP across the integration lifecycle, whether that's managing authentication, data normalization, security, or sync speeds.
Data normalization
A unified API solution normalizes all of the integrated customer data, converting it to a predefined data model. This ultimately allows an LLM to handle prompts with more precision.
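A minimal sketch of what that normalization looks like, assuming two hypothetical provider payloads: each provider names its fields differently, and the unified layer maps both onto one common model so an LLM always sees the same shape.

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    """The predefined common data model the LLM consumes."""
    id: str
    subject: str
    status: str

# Hypothetical provider payloads; real field names vary by application.
def from_provider_a(raw: dict) -> Ticket:
    return Ticket(id=raw["ticket_id"], subject=raw["title"], status=raw["state"])

def from_provider_b(raw: dict) -> Ticket:
    return Ticket(id=raw["id"], subject=raw["summary"], status=raw["status"].lower())

a = from_provider_a({"ticket_id": "A-1", "title": "Refund", "state": "open"})
b = from_provider_b({"id": "B-9", "summary": "Login bug", "status": "OPEN"})
assert a.status == b.status == "open"  # both normalized to one shape
```

With every provider mapped to `Ticket`, prompts and tool schemas can reference one stable set of field names instead of per-provider variants.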

Security
Unified API solutions can secure your integrations by giving you full control of the customer data you can access and who on your team can access it.
For example, a unified API solution can offer scopes, which let you or your customers toggle off the specific fields that customers don't want you to access and sync.
Observability
Unified API solutions can offer a full suite of integration observability features to help your customer-facing team manage any of your MCP-based integrations. This includes everything from automated issue detection to fully searchable logs.
Performance
Finally, unified API solutions can support integrations with fast sync speeds, and many support webhooks to sync data in real time, allowing an LLM to use the latest data for each customer.
Enable your LLM to access all of your customers' data via Merge MCP
We've just released our own MCP server—Merge MCP—to help you access all 220+ of our customer-facing integrations!
You’ll still get access to the rest of our features and capabilities, from our integration maintenance support to our features that let your customer-facing teams manage integrations independently.
You can even implement Merge MCP with a few lines of code:
{{this-blog-only-cta}}