
Building AI chatbots with RAG: a step-by-step guide

Jon Gitlin
Senior Content Marketing Manager
@Merge

As large language models (LLMs) continue to improve, organizations see increasing value in incorporating AI chatbots into their products.

AI chatbots can now address customer issues effectively; answer employees’ questions with high precision; help employees build robust analytics dashboards quickly—and much more.

The process of building these chatbots, however, isn’t always straightforward. 

To help guide you through the process, we’ll break down several steps you should take. But first, let’s align on what an AI chatbot is and why it’s important.

What is an AI chatbot?

It’s a chat interface that leverages retrieval-augmented generation (RAG) to answer users’ questions or take actions on their behalf. It can be used across your product in countless ways, as long as it can access all of the relevant data.
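Under the hood, a RAG-based chatbot typically retrieves the content most relevant to a user’s question and passes it to an LLM as context for the answer. Here’s a minimal sketch of that loop in Python; the sample documents, keyword-based retriever, and call_llm stub are placeholders you’d swap for your actual vector store and model provider.

```python
from dataclasses import dataclass

# Toy in-memory "knowledge base". In production, these would be chunks
# synced from your integrated systems and stored in a vector database.
@dataclass
class Document:
    title: str
    text: str

DOCS = [
    Document("PTO policy", "Employees accrue 1.5 days of paid time off per month."),
    Document("Expense policy", "Submit expenses within 30 days with a receipt attached."),
]

def retrieve(question: str, docs: list[Document], k: int = 1) -> list[Document]:
    # Toy retriever: rank documents by keyword overlap with the question.
    # A real chatbot would use embeddings and a vector store instead.
    q_words = set(question.lower().split())
    ranked = sorted(
        docs,
        key=lambda d: len(q_words & set(d.text.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def call_llm(prompt: str) -> str:
    # Stub so the sketch runs end to end; replace with your model provider's API.
    return f"(model response to: {prompt[:60]}...)"

def answer(question: str) -> str:
    # Build a prompt that grounds the model in the retrieved context.
    context = retrieve(question, DOCS)
    prompt = (
        "Answer using only this context:\n"
        + "\n".join(f"[{d.title}] {d.text}" for d in context)
        + f"\n\nQuestion: {question}"
    )
    return call_llm(prompt)

print(answer("How much paid time off do employees accrue per month?"))
```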

For example, Assembly, a comprehensive HR software platform, offers DoraAI, an AI chatbot that can answer employees’ questions in plain text based on the files it has access to. DoraAI can also display the source(s) it used in its answer in case the user wants to learn more.

A screenshot of DoraAI, an AI chatbot for employees

Related: What is an AI product strategy?

Why businesses need AI chatbots

While the reasons depend on the business and their unique requirements, here are some that apply to many companies:

  • Addresses users’ questions quickly: AI chatbots can respond to prompts in a matter of seconds (if not faster). This lets users tackle tasks and get issues resolved quickly, which helps explain why chatbots are so often used to improve the customer experience
  • Increases conversion rates on leads: By using AI chatbots to surface leads to the right reps, reps can reach out sooner, which, according to a prominent study, should help them convert leads at a higher rate
  • Improves employee engagement: AI chatbots can tackle many repetitive tasks that your employees would otherwise have to perform. This allows your employees to save significant time and focus on the work they’re more likely to enjoy
  • Cost savings: Since AI chatbots can complete a wide gamut of tasks effectively, they can reduce headcount needs both now and in the future, saving companies significant money
  • Competitive differentiation: Many companies in your space may not have adopted AI chatbots in their product, or at least not in the ways your team is scoping. If you can be the first mover in incorporating AI chatbots for a certain use case, you can earn and maintain a meaningful competitive advantage

The step-by-step process of building an AI chatbot

Here’s a breakdown of each step you’ll need to take.

1. Define your objective

Your objective will be key for executing on the rest of the steps, so it’s critical to establish it at the beginning.

It requires several inputs: the core purpose of your chatbot; the KPI(s) you want it to influence; and the audience you want it to serve (at least initially). You’ll also want to apply a goal-setting framework, such as SMART, so that the goal is actionable.

For example, Assembly can set the following goal for DoraAI, their AI chatbot for employees:

DoraAI’s goal can be to help all of its customers’ employees get every question answered. Assembly can measure its impact by whether their customers’ average Net Promoter Score® improves by at least 5 points within 3 months.


Related: A guide to setting product objectives

2. Choose a chatbot platform 

You’ll need to decide between an open-source chatbot framework, such as Rasa or Botpress, and a cloud-based platform, like Dialogflow, Ema, or Amazon Q.

The best choice depends on a few factors:

  • Customizability: If you have highly custom requirements, you may need to use an open-source chatbot framework, as it offers more flexibility in how it can be integrated with applications, the workflows it can support, and more
  • Scalability: If you plan to roll out the chatbot to a large audience (or you eventually plan to do so), it may be worth using a cloud-based platform as they handle the infrastructure on your behalf
  • Time to market: If your team needs to push the chatbot live as soon as possible, a cloud-based platform will likely be a better choice, as its implementation is typically simpler and faster
  • Resources: The process of building and maintaining chatbots can be fairly technical, and your team may not have the resources on hand to manage this project or may not be able to allocate them to it. In that case, a cloud-based platform is the more realistic option
  • Connectivity: If the chatbot needs to be integrated with several applications, it may be worth using a cloud-based platform. These platforms typically offer pre-built connections and may offer additional support for building new connections (e.g., technical teams on their side can add integrations on your behalf).

In short, there’s no one-size-fits-all answer, but you’ll likely want to use a cloud-based platform. 

3. Design the conversation flow

Once you’ve picked the solution, you’ll need to craft the conversation path based on the user inputs that can come up, including potential edge cases and how the chatbot should respond to them.
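If you’re mapping the flow by hand, it can be as simple as routing recognized intents to handlers and sending everything else to an explicit fallback. A rough sketch, with illustrative intents and canned responses:

```python
def handle_pto_question(message: str) -> str:
    # Illustrative canned answer; a real handler might call the RAG pipeline.
    return "You accrue 1.5 days of PTO per month."

def handle_ticket_status(message: str) -> str:
    return "Let me look up that ticket for you."

def handle_fallback(message: str) -> str:
    # Edge case: unrecognized input. Ask a clarifying question or escalate to a human.
    return "I'm not sure I can help with that. Would you like me to contact support?"

# Map intents to handlers; a production bot would classify intent with an LLM or NLP model.
FLOW = {
    "pto": handle_pto_question,
    "ticket_status": handle_ticket_status,
}

def classify_intent(message: str) -> str:
    # Naive keyword matching, just for the sketch.
    text = message.lower()
    if "pto" in text or "time off" in text:
        return "pto"
    if "ticket" in text:
        return "ticket_status"
    return "fallback"

def respond(message: str) -> str:
    handler = FLOW.get(classify_intent(message), handle_fallback)
    return handler(message)

print(respond("How much time off do I get?"))
print(respond("Tell me a joke"))  # unrecognized input hits the fallback path
```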

Fortunately, this step is becoming easier with the help of natural language processing (NLP).

Many tools let you describe the workflow you want an AI chatbot to support in plain text, then use an LLM to generate that workflow on your behalf. You can even iterate on it in plain text, helping you land on the optimal workflow.

A screenshot of a chatbot from Ema
Ema, a universal AI employee, lets you build AI agents and chatbots through simple descriptions

4. Train your AI chatbot

Your AI chatbot ultimately needs to access your customers’ latest data to provide personalized and actionable outputs.

For example, it can’t help a customer support rep tackle a specific client issue if it can’t access and interact with the customer support rep’s ticketing system.

To that end, you’ll need to build API-based integrations (as opposed to file-based integrations, which are slower and more error-prone) with all of the systems that support your chatbot’s use cases.
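As a rough illustration, syncing data from an integrated application so the chatbot can use it might look like the following. The endpoint, field names, and embed_and_store helper are hypothetical; in practice, you’d rely on the integration provider’s API or SDK.

```python
import requests

# Hypothetical endpoint for an integrated ticketing system.
TICKETING_API = "https://api.example-ticketing.com/v1/tickets"

def sync_tickets(api_token: str) -> list[dict]:
    """Fetch the latest tickets from the integrated ticketing system."""
    response = requests.get(
        TICKETING_API,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("results", [])

def embed_and_store(ticket: dict) -> None:
    # Hypothetical helper: chunk the ticket, embed it, and write it to your
    # vector store so the chatbot can retrieve it when answering questions.
    text = f"{ticket.get('subject', '')}\n{ticket.get('description', '')}"
    print(f"Indexing ticket {ticket.get('id')}: {text[:40]}...")

def run_sync(api_token: str) -> None:
    for ticket in sync_tickets(api_token):
        embed_and_store(ticket)
```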

A screenshot of the integrations that Ema offers
Since Ema’s AI chatbots and agents can help employees across departments, their platform offers integrations with several software categories—from file storage to ticketing systems to CRMs

You’ll also need to fine-tune the chatbot based on user inputs; while it may not perform well at the beginning, it should consistently improve.
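One lightweight way to drive that improvement is to log every exchange along with the user’s rating, then use the low-rated exchanges to refine retrieval, prompts, or fine-tuning data. A minimal sketch:

```python
import json
import time

FEEDBACK_LOG = "chatbot_feedback.jsonl"

def log_feedback(question: str, answer: str, helpful: bool) -> None:
    """Append one exchange plus the user's rating to a JSONL log."""
    record = {
        "timestamp": time.time(),
        "question": question,
        "answer": answer,
        "helpful": helpful,
    }
    with open(FEEDBACK_LOG, "a") as f:
        f.write(json.dumps(record) + "\n")

# Low-rated exchanges become candidates for better retrieval, prompt tweaks,
# or fine-tuning examples in the next iteration.
log_feedback("What's our PTO policy?", "You accrue 1.5 days per month.", helpful=True)
```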

Related: Ideas for AI products

5. Secure your AI chatbot 

The integrations that support your chatbot should enforce access control lists (ACLs), which ensure that users can only access the data their permissions in the integrated application allow.

For example, if a user asks your chatbot for sensitive business information, the chatbot can determine that the user doesn’t have access to the file containing that information and decline to share it.
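In practice, this usually means filtering retrieved documents against the user’s permissions before anything reaches the LLM. A simplified sketch, with an illustrative group-based permission model:

```python
from dataclasses import dataclass

@dataclass
class Document:
    title: str
    text: str
    allowed_groups: set[str]  # groups permitted to view this document in the source system

@dataclass
class User:
    email: str
    groups: set[str]

def filter_by_acl(user: User, docs: list[Document]) -> list[Document]:
    """Keep only documents the user may see before retrieval results reach the LLM."""
    return [d for d in docs if d.allowed_groups & user.groups]

docs = [
    Document("Q3 board deck", "Sensitive financials...", {"finance", "exec"}),
    Document("Employee handbook", "PTO policy...", {"all_employees"}),
]
user = User("sam@example.com", {"all_employees"})

# Only the handbook survives the filter, so the chatbot never sees the board deck
# when answering this user's question.
print([d.title for d in filter_by_acl(user, docs)])
```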

6. Test and monitor the chatbot

Before pushing the chatbot to production, your team should test it with a wide range of inputs, including edge cases. You should also share it with colleagues who match your target user profiles but haven’t spent as much time working on the chatbot; they can approach it with fresh eyes and catch issues you can’t.

Finally, your team should monitor the chatbot closely after it goes live, as small issues can have a long-term impact on how it’s perceived.
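A simple starting point is a small regression suite of representative prompts (including edge cases) that runs before every release. The sketch below uses pytest and assumes a hypothetical answer() entry point from your chatbot code:

```python
import pytest

from chatbot import answer  # hypothetical: your chatbot's answer() entry point

TEST_CASES = [
    ("How many PTO days do I accrue each month?", "1.5"),       # happy path
    ("asdfgh???", "not sure"),                                  # gibberish should hit the fallback
    ("Show me every employee's salary", "don't have access"),   # permission-gated request
]

@pytest.mark.parametrize("question,expected_fragment", TEST_CASES)
def test_chatbot_responses(question, expected_fragment):
    response = answer(question)
    assert expected_fragment.lower() in response.lower()
```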

Best practices for creating an AI chatbot

As you build your AI chatbot, it’s worth keeping the following in mind:

  • Create a chatbot personality that reflects your brand values. Your chatbot is often the first line of communication for prospects, and your customers and employees will likely interact with it often. As a result, it’s critical that the chatbot interacts with users in a way that reflects your brand’s core values—whether that’s humility, integrity, kindness, and so on

  • Provide links to the sources used in its outputs. Whether fair or not, LLMs have gained a reputation for hallucinating, or generating inaccurate outputs.

To give users confidence that your AI chatbot provides accurate information, you can include the sources used in a given output and make them clickable, allowing users to verify the information and learn more (see the sketch after this list).

Example of DoraAI linking out to a source
DoraAI includes sources in every response it generates, allowing users to not only verify information but also access the documents they need quickly
  • Offer suggested prompts by default. Many users may not know what your AI chatbot is capable of. They also might not think of the most valuable ways to use it. To help guide them in the right direction, you can offer suggestions and even provide brief descriptors of what your chatbot can do (this is especially helpful when it’s the user’s first time using your chatbot)

  • Use webhooks when possible. Webhooks allow your chatbot to access and use real-time data when generating outputs. This can prove critical for several use cases, from routing warm leads to sales reps to surfacing security incidents

  • Leverage a unified API solution. A unified API solution offers a single, aggregated API to help you add hundreds of integrations. Since your customers’ tech stacks likely vary, and you may need to support different categories of integrations to cover all of your chatbot’s use cases, this can help you take your chatbot to market faster
A screenshot of how Merge works
Merge lets you add 220+ product integrations across 6 software categories
  • Feed normalized data to the LLM that powers your chatbot. Normalized data, or data that’s transformed to fit a predefined data model, can be embedded more accurately because it doesn’t include miscellaneous information. That makes it easier to retrieve consistently for relevant prompts, so it can be fed to your chatbot’s LLM and used in its outputs
How normalized data leads to better outputs
An AI chatbot can use normalized data to answer prompts—like “Give me the marketing team’s first names and email addresses”—successfully
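To make the source-linking practice above concrete, here’s a sketch of a response payload that pairs the answer with clickable sources; the field names and URLs are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Source:
    title: str
    url: str

@dataclass
class ChatbotResponse:
    answer: str
    sources: list[Source]

def build_response(answer_text: str, retrieved_docs: list[dict]) -> ChatbotResponse:
    """Attach the retrieved documents' titles and URLs so users can verify the answer."""
    sources = [Source(d["title"], d["url"]) for d in retrieved_docs]
    return ChatbotResponse(answer=answer_text, sources=sources)

response = build_response(
    "Employees accrue 1.5 days of PTO per month.",
    [{"title": "Employee handbook", "url": "https://drive.example.com/handbook"}],
)
print(response.answer)
for source in response.sources:
    print(f"Source: {source.title} ({source.url})")
```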

Related: Best practices for using retrieval-augmented generation

Power best-in-class AI chatbots with Merge

Merge, the leading unified API solution, lets you add hundreds of HRIS, ATS, file storage, ERP, ticketing, and CRM integrations through a single build. 

Merge also:

  • Normalizes all of the integrated data across its integrations according to its Common Models
  • Offers Integration Observability features to help your customer-facing teams manage any integration issues
  • Supports webhooks to help your chatbot use real-time data

Learn more about how Merge can support your AI chatbots by scheduling a demo with one of our integration experts.


Jon Gitlin is the Managing Editor of Merge's blog. He has several years of experience in the integration and automation space; before Merge, he worked at Workato, an integration platform as a service (iPaaS) solution, where he also managed the company's blog. In his free time he loves to watch soccer matches, go on long runs in parks, and explore local restaurants.

