How To Prep Your Customer Service Help Center for Generative AI


Generative AI is shaking up the world of customer support. Here’s how to ensure your customer service knowledge base is set up for success — to help you see the most value from this game-changing tech.

Large language models (LLMs) and generative AI have taken the world by storm. The natural, conversational experience these bots provide has truly raised the bar. And having seen how human-like AI-powered interactions can be, customers now expect the same in support settings.

One of the most innovative gen AI use cases in customer support is instantly pulling information from a knowledge source like your digital help center. By connecting an LLM to your help center (like your Zendesk Knowledge Base) or FAQ page, you can serve the most up-to-date support information to your customers and start having more accurate, human-sounding support conversations within minutes. No bot training needed.

Find out more about UltimateGPT — the generative AI bot to revolutionize your support.

But to see the most value from gen AI bots (like UltimateGPT), it’s essential that the data the LLM has access to is presented as concisely and coherently as possible. To help you get your customer service knowledge base generative AI-ready, here are the best practices to follow.

Let's start with what actually happens when you feed generative AI a knowledge source, then cover help center architecture best practices, and finally drill down into more detailed formatting tips.

What happens when a knowledge source is fed to a gen AI-powered bot?

When you feed your knowledge source into a generative AI-powered bot, the text of the knowledge source is intelligently broken down into “chunks” of text. This is the first step in a framework called Retrieval-Augmented Generation, or RAG. RAG enables your LLM to access information beyond its original training data, such as your carefully crafted Zendesk help center articles.

These chunks are then stored in a database that’s organized by semantic meaning. When a user message is sent to the AI, the meaning of that message is compared with the meaning of the chunks in the database to surface the best match. That information is then used by your bot — in accordance with its instructions, including tone of voice, safety guardrails, etc. — to answer the user’s message.
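To make the chunking step concrete, here’s a deliberately simplified sketch. The paragraph-boundary splitting and the `max_chars` limit are illustrative assumptions; production chunkers also weigh the semantic coherence of the content, as described above:

```python
def chunk_text(text, max_chars=200):
    """Split text into chunks at paragraph boundaries.

    Toy sketch only: real chunkers vary chunk size to capture
    meaning as well as length, not just a character limit.
    """
    chunks, current = [], ""
    for para in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would exceed the limit
        if current and len(current) + len(para) > max_chars:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks
```

Each resulting chunk is what gets stored and retrieved, which is why keeping a question and its answer in the same stretch of text matters so much.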

In more detail, here’s how gen AI uses your knowledge base to answer customer questions:

  1. Breaking it down: Initially, your knowledge source is imported and segmented into what we call "chunks." These chunks vary in size, tailored to capture both the length and the intrinsic meaning of your content.
  2. Understanding through numbers: Each chunk then receives its unique numerical signature—a vector representing the semantic meaning of the chunk. Essentially, it’s translating your text into a mathematical language that the AI can understand and store efficiently in a vector database.
  3. Matching wits: When a user poses a question to your bot, the system compares the semantic meaning of the question with these stored vectors to find the best match. This process ensures that the most relevant chunks are retrieved to provide precise and informed answers.
  4. Et voilà: Finally, your AI-powered bot uses the retrieved information to answer the user’s query, in accordance with its instructions, guardrails, and Bot Persona settings like tone of voice and answer length.
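Steps 2 and 3 above, embedding and matching, can be illustrated with a toy sketch. A real system uses a learned embedding model and a vector database; here, a simple word-count vector and cosine similarity stand in for both, purely to show the retrieve-by-similarity idea:

```python
import math
import re
from collections import Counter

def embed(text):
    """Turn text into a vector. Assumption: a word-count vector stands in
    for the learned semantic embeddings a real system would use."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question, chunks, top_k=1):
    """Return the chunk(s) whose vector best matches the question's vector."""
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:top_k]

chunks = [
    "To reset your password, go to Settings and choose 'Forgot password'.",
    "Standard shipping takes 3-5 business days.",
]
retrieve("How do I reset my password?", chunks)
# returns the password-reset chunk
```

Because retrieval works this way, a chunk that pairs the question with its answer will match far better than a chunk containing the answer alone.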

By understanding the main point here — that chunks are the basis of your bot’s replies — you can better prepare your knowledge sources to be more compatible and effective when integrated into your gen AI bot.

And while LLMs and RAG are at the forefront of today's technological advancements, capturing well-deserved attention with their innovative capabilities, we don’t have Artificial General Intelligence (that is, AI that can carry out any task a human can) just yet. So as you integrate gen AI into your workflow, remember that it draws its insights entirely from the text chunks created from your connected knowledge sources, rather than browsing the web or carrying out research in the background.

Pro tips for formatting your knowledge base from our in-house AI researchers

Before we get into general best practices, here are our AI researchers’ top two tips for prepping your help center:

  1. Make sure each article directly answers a customer question. Not only will this help the LLM perform better, but it’ll also make life easier for your human users when they search your help center for an answer.
  2. Align questions and topics with their solutions. If the question (e.g., "How do I XYZ?") or topic (e.g., "Steps to perform XYZ") appears only in the title, it may not always remain attached to its corresponding answers or instructions during the chunking process. So to keep the context clear, it’s a smart move to repeat the question or the key statement near the steps or information in the body of the article. This helps ensure that each chunk of the article is comprehensive and remains useful on its own.

This practice not only keeps the question linked to its answer within the chunk but also improves the likelihood that the retrieval system will present complete and contextually accurate responses.

Knowledge base architecture best practices for UltimateGPT

If you're using our generative AI-powered bot, UltimateGPT, and you want it to function at its peak, it's crucial to refine the architecture of your knowledge base. Here are some straightforward guidelines to get you started:

  1. Eliminate redundancies: Sift through your content and remove any duplicate or conflicting information. Remember, the accuracy of the bot hinges on the quality of the data it receives. Always prioritize the most recent and pertinent content.
  2. Depth and focus: Structure your articles to be hyper-focused. Each support topic should be thoroughly covered within your digital help center. Unlike humans, UltimateGPT cannot browse external web pages or follow links to gather additional context; hence, it's vital that all necessary information is self-contained.
  3. Tag your content: Implement content tags. This becomes particularly beneficial when you want to tailor content visibility based on user attributes, like geographical location, using UltimateGPT's Search Rules feature.
  4. Opt for text: Ensure there is a text-only version of every article. Since UltimateGPT interprets text and saves its meaning into a database, anything in your help center that’s not text — images, videos, diagrams — will neither be read nor saved in the database.

Formatting your articles for optimal clarity

An organized structure can significantly enhance the accessibility and usability of your content. Remember, each chunk is stored and retrieved based on its meaning, so the clearer that is, the better.

  1. Use clear hierarchy: Use titles and subtitles effectively, and structure the content with action-oriented steps. As per the pro tip above, avoid separating topics or questions from their answers.
  2. Avoid nested instructions: If multiple solutions exist for a problem, present each as a separate instruction rather than sub-steps within a broader step. This clarity will aid both your users and the LLM in finding solutions quickly.
  3. Include introductions: Each article should begin with an introduction that outlines the relevance and the problems it aims to solve. This is beneficial for human users currently, and will become increasingly relevant for LLMs as their contextual understanding evolves.
  4. Keep it simple: Keep paragraphs short, focused on answering specific questions directly or explaining topics concisely. Likewise, sentences should be direct and to the point—this also aids in better translations.
  5. Structure lists: Use bullet points to list facts or tips, and number your steps when detailing a process. Worth noting: although text-based tables (i.e. actual text, not an image) can be read by a bot like UltimateGPT, typically they are harder for an LLM to understand and assign meaning to than information laid out in normal sentences — so it’s better to stick to natural language everywhere you can.
  6. Clarify terminology: Always spell out terms in full with the abbreviation in parentheses when first introduced. This ensures clarity for all users.
  7. Know your audience: If your help center — or a part of it — was created for your own agents rather than as a customer-facing resource, consider whether the information needs to be rewritten in a way that’s appropriate for your customer service bot to deliver to your customers.

As with implementing any new technology, a little bit of prep goes a long way. So get your customer service knowledge base ready before plugging in a generative AI solution. You’ll reap the rewards: faster time-to-value and more accurate automated support.

And finally, it's important to remember that a generative AI help center bot isn't a silver bullet that can solve all of your support issues. Instead, this technology should be used as part of a broader, well-planned CX strategy: where gen AI, intent-based automation, and (of course) human agents all play to their strengths and work together to deliver the best experience for your customers.

Ready to revolutionize your support with gen AI?