Large language models (LLMs) and generative AI have taken the world by storm. The natural, conversational experience these bots provide has truly raised the bar. And having seen how human-like AI-powered interactions can be, customers now expect the same in support settings.
One of the most innovative gen AI use cases in customer support is instantly pulling information from your digital help center. Connect an LLM to your knowledge base or FAQ page and, within minutes, you can serve the most up-to-date support information to your customers in accurate, human-sounding conversations. No bot training needed.
Find out more about UltimateGPT — the generative AI bot to revolutionize your support.
But to see the most value from gen AI bots (like UltimateGPT), it's essential that the data the LLM can access is presented as concisely and coherently as possible. To help you get your customer service knowledge base generative AI ready, here are the best practices to follow.
Best practices for structuring your customer service knowledge base
Let’s start with the overall architecture, then drill down into more detailed formatting tips.
Knowledge base architecture best practices
- Delete any duplicate articles or conflicting information: your bot is only as accurate as the data it draws on, so feed it the most recent and relevant content available
- Make sure articles are hyper-focused and that your knowledge base covers each support topic completely: unlike humans, the bot can’t follow links or navigate to external web pages and support resources to get a better understanding of a topic — all relevant information should be available in your knowledge base
- Use content tags: this is especially important if you want to show different content to different user types — based on country for example (UltimateGPT has a feature coming soon that will allow customers to surface different answers based on user segment)
- Make sure the text-only versions of articles are clear: LLMs only process text, so if your articles contain images or diagrams, these won’t be understood by the model
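As a rough illustration of the first architecture tip, you can catch duplicate articles automatically before they confuse the bot. Here's a minimal Python sketch using the standard library's `difflib`; the article IDs, texts, and the 0.85 similarity threshold are all hypothetical, not an UltimateGPT feature.

```python
from difflib import SequenceMatcher

# Hypothetical sample articles; in practice, load these from your help center export.
articles = {
    "reset-password": "To reset your password, open Settings and choose 'Reset password'.",
    "password-reset": "To reset your password, open Settings and select 'Reset password'.",
    "close-account": "To close your account, contact support with your account email.",
}

def find_near_duplicates(articles, threshold=0.85):
    """Flag pairs of articles whose text similarity exceeds the threshold."""
    ids = sorted(articles)
    pairs = []
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ratio = SequenceMatcher(None, articles[a], articles[b]).ratio()
            if ratio >= threshold:
                pairs.append((a, b, round(ratio, 2)))
    return pairs
```

Running `find_near_duplicates(articles)` flags the two near-identical password articles so you can merge them, while the unrelated account-closure article passes through untouched.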
Article formatting best practices
- Provide a clear structure within each article: use titles, subtitles, and include action-based steps to follow
- Avoid using sub-steps: provide specific instructions to cover each individual use case — for example if there are multiple ways for customers to activate a new bank card, present these instructions separately rather than as sub-steps within a single step
- Include an introduction: this should cover the value and problems solved, to explain why a process or product is a certain way (currently this tip is more to help human readers — but once the contextual awareness of our LLM bot improves, including an overview will become much more relevant)
- Use short paragraphs that answer a question or explain a topic
- Keep sentences short and sweet (this also helps with translation)
- Use bullet points for facts or tips, and numbers for steps
- Write terms out in full (with the abbreviation in parentheses) when using them for the first time
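Several of the formatting tips above can even be checked automatically. Below is a hypothetical linter sketch: the Markdown-style title check and the 25-word sentence cap are assumptions for illustration, not UltimateGPT requirements.

```python
import re

MAX_SENTENCE_WORDS = 25  # assumption: a practical cap for "short and sweet" sentences

def lint_article(text):
    """Return a list of formatting issues for one knowledge base article (hypothetical rules)."""
    issues = []
    lines = [line for line in text.splitlines() if line.strip()]
    # Tip: provide a clear structure — here we assume a Markdown-style "#" title.
    if not lines or not lines[0].startswith("#"):
        issues.append("missing title")
    # Tip: keep sentences short — check prose lines, skipping titles and bullets.
    body = " ".join(line for line in lines if not line.startswith(("#", "-", "*")))
    for sentence in re.split(r"(?<=[.!?])\s+", body):
        if len(sentence.split()) > MAX_SENTENCE_WORDS:
            issues.append(f"long sentence: {sentence[:40]}...")
    return issues
```

A well-structured article (title, short intro, bulleted steps) comes back with an empty issue list; an untitled wall of text gets flagged on both counts.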
Our AI researchers’ number-one tip? Make sure each article directly answers a customer question. Not only will this help the LLM perform better, but it’ll also make life easier for your human readers.
As with implementing any new technology, a little bit of prep goes a long way. So get your customer service knowledge base ready before plugging in a generative AI solution. You’ll reap the rewards: faster time-to-value and more accurate automated support.
And finally, it's important to remember that generative AI isn't a silver bullet that can solve all of your support issues. Instead, this technology should be used as part of a broader, well-planned CX strategy: where gen AI, intent-based automation, and (of course) human agents all play to their strengths and work together to deliver the best experience for your customers.